Beyond Grey Pinstripes 2007-2008 Research Methodology
In the fall of 2006, initial invitations to participate in the 2007 edition of Beyond Grey Pinstripes were sent to 590 business schools worldwide. Working with the AACSB and other international accrediting associations, we limited eligibility in this survey cycle to in-person MBA programs with full-time enrollment. An exhaustive outreach effort took place over the following months to ensure that each business school that had not replied to our initial invitation received further information via email and, in many cases, personal phone calls from Aspen Institute Center for Business Education staff.
Schools that met the eligibility criteria and also communicated an interest in participating were given a personalized username and password that granted their staff access to the Beyond Grey Pinstripes online survey site. Pinstripes survey questions have been refined every two years; in 2006, Aspen CBE staff updated the questions in consultation with an advisory group of thirty business school faculty. At the beginning of the survey period, Aspen CBE staff offered two online tutorials giving schools an overview of how to use the survey tool. Throughout the data collection period, support was available to all participating schools via email and telephone.
Data Collection / Survey
Data were collected in three broad categories, similar to surveys from prior editions: Coursework, Faculty Research, and Institutional Support. All information collected that was deemed relevant to the mission of the survey is available for viewing on this website.
MBA Coursework: This section focuses on core and elective courses that include social impact management and/or environmental topics. To clarify the context, schools were required to submit supporting syllabi, course descriptions, or a URL, with syllabi preferred. Schools were also asked whether the entire course was dedicated to social impact management, environmental impact management, or both, or whether it contained some "relevant" content, i.e. individual class sessions or modules on pertinent topics. The survey also asked schools to indicate the department, instructor, number of MBA students enrolled in each course, credit hours for each course, and total school enrollment.
Faculty Research: Schools were asked for the names of any faculty members who had conducted research on social or environmental business topics, along with a history of their research for calendar years 2005 and 2006. This information was used later to identify the actual articles published.
Institutional Support: Institutions were asked to report exemplary non-curricular activities and programs that specifically addressed social impact management and environmental management. Information was requested about external speakers, seminars and conferences, orientation activities, internships and consulting programs, MBA student competitions, clubs and groups, career development services, university institutes and centers, joint degrees, specializations, and other relevant activities. All of this information is available for comparison and review in the database. Information on Institutional Support was not used in calculating the rank of each school.
The schools were also asked to submit a 500-word summary of their program.
In total, over forty thousand pages of data were collected from the 111 Pinstripes schools this year (representing an institutional response rate of 18.8%). 71 institutions are located in the United States, and 40 are located internationally (representing 18 countries).
An attempt was made to “clean” all data of typos and obvious errors. For example, if a rural school reported an MBA student enrollment of 10,000, Aspen CBE staff contacted the school to correct the obvious mistake. Beyond such obvious errors, Aspen CBE staff do not attempt to assess the validity of self-reported data. To support the transparency and accuracy of data reporting we:
1. Require an online signature from the reporter, pledging honesty and accuracy. Data indicate that such pledges have a real effect on reporting.
2. Make nearly all reported data publicly available so that peer schools, students, and alumni can review reported data.
Scoring / Ranking Calculations:
The ranking is tabulated directly from the self-reported survey data. Only Coursework and Research data were used in the ranking calculations. Institutional activities were cleaned for grammar and spelling, and posted directly to the website.
Some survey metrics are calculated directly from data reported. Others require a subjective review of courses or research. For subjective reviews, Aspen CBE used a team of ten PhD and DBA Research Fellows, selected in a competitive process from leading institutions around the world. Research Fellows were trained carefully on the survey protocol. In any review by Research Fellows, scoring was done “blind” (without school or faculty names associated with data) and in pairs. This was done to obtain inter-rater reliability (consistency) and to minimize biases.
The calculated and reviewed data were aggregated into four “raw score” metrics, as follows:
Student Opportunity Raw Score: A simple count of the number of courses that contain social, environmental, and/or ethical content. Research Fellows asked “Is there clear evidence in the course title, description, or syllabus of the presence of social or environmental issues in this course?”
Student Exposure Raw Score: A calculated value based on the sum of the Student Exposure for each course that received Student Opportunity credit. Student Exposure answers the question: What proportion of total teaching hours in a given year are dedicated to social and environmental issues? The formula for each course is [(percent of course time dedicated to such issues) x (course length / program length) x (course enrollment / total school enrollment)]. This adjusts for both varying program length and size of the student body. All things being equal, longer courses, dedicated courses, or courses with higher enrollments (whether required or simply popular electives) count for more.
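The per-course formula above can be sketched in code as follows (the function and parameter names are illustrative, not taken from the survey tool):

```python
def course_exposure(pct_time, course_hours, program_hours,
                    enrollment, school_enrollment):
    """Student Exposure contribution of a single course.

    pct_time: fraction of course time on social/environmental issues (0.0-1.0)
    course_hours / program_hours: adjusts for varying program length
    enrollment / school_enrollment: adjusts for size of the student body
    """
    return (pct_time
            * (course_hours / program_hours)
            * (enrollment / school_enrollment))

def student_exposure_raw_score(courses):
    """Sum of exposure over all courses that received Student Opportunity credit.

    `courses` is a list of dicts whose keys match course_exposure's parameters.
    """
    return sum(course_exposure(**c) for c in courses)
```

For example, a fully dedicated 30-hour course in a 600-hour program, enrolling 50 of 500 students, contributes 1.0 x (30/600) x (50/500) = 0.005 to the school's raw score.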
Course Content: A simple count of the number of courses that not only demonstrate Student Opportunity but specifically address the intersection of social and environmental issues with mainstream, for-profit business. This metric reflects the focus of the Aspen CBE on how business plays a role in society. For example, course content that focused on the nonprofit sector or on a philosophical approach to ethics would receive credit for Student Opportunity but would likely not receive credit for Course Content. A finance course that addresses models for pricing the cost of carbon would likely receive credit for Course Content.
Faculty Research: Indicates the number of “author-credits” attributable to school faculty. We searched 80 leading academic journals for all articles published by the faculty each school named in the survey. The journals accepted in the survey were determined prior to data review by a group of thirty faculty advisors and can be reviewed by contacting Aspen CBE staff. We searched for all reported names as well as for common variants (e.g. Robert Smith and Bob Smith). Research Fellows reviewed all abstracts of matching articles using criteria similar to Student Opportunity. When an article had multiple authors, each author earned one point for his or her school.
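The author-credit tally described above might be sketched as follows; the variant-name mapping shown here is a simplified illustration, not Aspen CBE's actual matching procedure:

```python
def author_credits(articles, school_faculty, name_variants=None):
    """Count author-credits for one school.

    articles: list of author-name lists, one per matching article
    school_faculty: faculty names the school reported in the survey
    name_variants: maps common variants to reported names,
                   e.g. {"Bob Smith": "Robert Smith"} (illustrative)
    Each faculty co-author of a matching article earns one point.
    """
    name_variants = name_variants or {}
    faculty = set(school_faculty)
    total = 0
    for article_authors in articles:
        for name in article_authors:
            if name_variants.get(name, name) in faculty:
                total += 1
    return total
```

Here two articles by "Robert Smith", one published under "Bob Smith", would yield two author-credits for the school that reported Robert Smith.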
Each “raw score” metric was adjusted using a statistical smoothing process (square-root standard deviation about the mean), which produces numerical values representing how well a school performed relative to the other schools in the survey. Each of these four values, or z-scores, was weighted at 25% and then summed to arrive at an overall point total. The final ranking is an ordinal list of the top 100 schools by total points received.
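The weighting and summation step can be sketched with a plain z-score (standardization about the mean); this sketch does not reproduce the square-root smoothing adjustment, whose exact form is not specified here:

```python
from statistics import mean, pstdev

def z_scores(raw):
    """Standardize a list of raw scores relative to all schools surveyed."""
    mu, sigma = mean(raw), pstdev(raw)
    return [(x - mu) / sigma for x in raw]

def total_points(metric_columns, weights=(0.25, 0.25, 0.25, 0.25)):
    """Weighted sum of standardized metrics, one total per school.

    metric_columns: four lists of raw scores (one list per metric),
    aligned by school; each metric is weighted at 25% per the text.
    """
    standardized = [z_scores(col) for col in metric_columns]
    return [sum(w * col[i] for w, col in zip(weights, standardized))
            for i in range(len(metric_columns[0]))]
```

The final ranking would then simply sort schools by their total points in descending order.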