The Scientist Readers' Survey Methodology
Best Places to Work in Academia 2009
Survey Form: A web-based survey form, built with Infopoll software, was posted from May 5 to July 3, 2009. Results were collected and collated automatically.
Invitations: E-mail invitations were sent to readers of The Scientist and to registrants on The Scientist web site who identified themselves as life scientists with a permanent position in an academic, hospital, government, or research organization. Responses were also solicited via advertising on The Scientist web site and through other electronic promotions.
Responses: 2,355 usable and qualified responses were received. Responses were rejected if the respondent did not identify himself or herself as a life scientist with a permanent position in an academic, hospital, government, or research organization, if the respondent's institution was not identified or identifiable, if the response was a duplicate based on e-mail address and other criteria, or if the response showed other signs of being invalid.
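For illustration, a minimal sketch of this qualification and de-duplication step is shown below using pandas. The file name and column names (is_life_scientist, has_permanent_position, institution, email) are assumptions for illustration, not fields of the actual Infopoll export.

```python
import pandas as pd

# Load the raw survey export (file and column names are illustrative
# assumptions; the actual Infopoll export format is not described above).
responses = pd.read_csv("survey_responses_2009.csv")

# Keep only respondents who identified themselves as life scientists holding
# a permanent position at an academic, hospital, government, or research
# organization, and whose institution could be identified.
qualified = responses[
    responses["is_life_scientist"]
    & responses["has_permanent_position"]
    & responses["institution"].notna()
]

# Reject duplicate submissions, keeping one response per e-mail address.
qualified = qualified.drop_duplicates(subset="email", keep="first")
```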
Analysis: Respondents were asked to assess their working environment against 38 criteria in 8 areas. Each criterion was presented as a positive statement with which the respondent was asked to agree or disagree. Answers were scored on a 1 - 5 scale, with 5 = "Strongly agree", 3 = "Neither agree nor disagree", and 1 = "Strongly disagree". Respondents were also asked to rate how important each factor was to them.
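As a sketch, the 1 - 5 scoring can be expressed as a simple lookup. Note that the labels for 2 and 4 ("Disagree" and "Agree") are assumptions, since only the anchors for 1, 3, and 5 are stated above.

```python
# Map the five agreement levels to the 1 - 5 scale described above.
# Only the anchors for 1, 3, and 5 are stated in the methodology; the
# labels for 2 and 4 are assumed.
LIKERT_SCALE = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def score_answer(answer: str) -> int:
    """Convert one agreement answer to its numeric score."""
    return LIKERT_SCALE[answer]

# Applied to a whole column of answers for one statement (column name assumed):
# qualified["q1"] = qualified["q1_answer"].map(LIKERT_SCALE)
```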
Identification of Institutions: As far as possible, institutions were identified and their names standardized.
Thresholds: US organizations that received fewer than 5 responses were omitted from the rankings, as were non-US organizations with fewer than 4 responses. We ranked 119 institutions: 94 from the US and 25 from the rest of the world.
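A sketch of this threshold filter, continuing the assumed column names from the earlier sketch:

```python
# Count qualified responses per institution, then apply the minimums stated
# above: at least 5 responses for US institutions, at least 4 elsewhere.
counts = qualified.groupby("institution")["email"].transform("size")
minimum = qualified["country"].eq("US").map({True: 5, False: 4})
ranked_pool = qualified[counts >= minimum]
```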
Scoring: Scores for each statement were averaged by institution, country, and institution type.
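Assuming the 38 statement scores sit in columns q1 ... q38 (an assumed naming convention), the three levels of averaging might look like the following:

```python
# Average each statement's 1 - 5 score at three levels of aggregation.
statement_cols = [f"q{i}" for i in range(1, 39)]   # assumed column names

by_institution = ranked_pool.groupby("institution")[statement_cols].mean()
by_country = ranked_pool.groupby("country")[statement_cols].mean()
by_type = ranked_pool.groupby("institution_type")[statement_cols].mean()
```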
Ranking: To calculate the overall rankings of institutions, we first weighted each factor based on its importance ranking. Because several factors rated as important in the United States are valued less elsewhere, and vice versa, we used different factor weightings to rank US and non-US institutions. The overall rankings were based on the average score per institution across all factors, weighted as described.
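The exact weighting formula is not published. One plausible reading, sketched below under stated assumptions, is to weight each statement by the mean importance rating given by the relevant group of respondents and normalise the weights to sum to 1. The importance-column convention (imp_q1 ... imp_q38) is an assumption, and the sketch continues the variables defined above.

```python
def importance_weights(importance_ratings: pd.DataFrame) -> pd.Series:
    """One plausible weighting scheme (the survey does not publish its exact
    formula): weight each statement by its mean importance rating,
    normalised so the weights sum to 1."""
    mean_importance = importance_ratings.mean()
    return mean_importance / mean_importance.sum()

def weighted_overall_score(statement_means: pd.DataFrame,
                           weights: pd.Series) -> pd.Series:
    """Weighted average of the statement means, one score per institution."""
    return statement_means.mul(weights).sum(axis=1)

# Assumed convention: importance ratings sit in columns imp_q1 ... imp_q38,
# parallel to the statement scores in q1 ... q38.
importance_cols = [f"imp_q{i}" for i in range(1, 39)]

# US institutions are ranked with weights derived from US respondents;
# non-US institutions follow the same pattern with non-US respondents.
us_resp = ranked_pool[ranked_pool["country"] == "US"]
us_importance = us_resp[importance_cols].set_axis(statement_cols, axis=1)
us_weights = importance_weights(us_importance)

us_institutions = us_resp["institution"].unique()
us_ranking = (
    weighted_overall_score(by_institution.loc[us_institutions], us_weights)
    .sort_values(ascending=False)
)
```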
Institutions were also ranked based on all factors, unweighted. In addition, we ranked institutions based on unweighted average scores for the 8 categories of statements included in the survey (a sketch of both unweighted rankings follows the list below). These categories are:
- Job Satisfaction
- Peers
- Infrastructure and Environment
- Research Resources
- Pay
- Management and Policies
- Teaching and Mentoring
- Tenure and Promotion
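A sketch of the unweighted overall ranking and the per-category rankings, continuing the variables above. The assignment of statements to the 8 categories is assumed; only two categories are filled in for illustration.

```python
# Unweighted overall ranking: plain mean of all 38 statement means.
unweighted_ranking = by_institution.mean(axis=1).sort_values(ascending=False)

# Per-category rankings: mean of the statements in each category. The
# statement-to-category mapping below is illustrative only; the survey's
# actual assignment of its 38 statements is not reproduced here.
categories = {
    "Job Satisfaction": ["q1", "q2", "q3"],
    "Peers": ["q4", "q5"],
    # ... remaining six categories follow the same pattern
}

category_rankings = {
    name: by_institution[cols].mean(axis=1).sort_values(ascending=False)
    for name, cols in categories.items()
}
```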
Caveats:
- The sample of respondents, while large, was self-selected, which may introduce some bias into the results.
- The scoring of results is not standardized, and standards may vary among individuals, institutions, and countries.
- In some cases, small numbers of responses may have biased the results.
- No attempt has been made to measure the statistical significance of the results; the difference between, say, a 10th-ranked and a 20th-ranked institution may be insignificant.
Survey Development and Analysis: The survey development and data analysis were carried out by AMG Science Publishing (www.amgpublishing.com).