The Scientist Readers' Survey: Methodology
Best Places to Work in Academia 2010
Survey Form: A web-based survey form, built with Infopoll software, was posted online from September 9, 2009 to March 8, 2010. Results were collected and collated automatically.
Invitations: E-mail invitations were sent to readers of The Scientist and to registrants on The Scientist web site who identified themselves as life scientists holding a permanent position in an academic, hospital, government, or research organization. Responses were also solicited via advertising on The Scientist web site and through other electronic promotions.
Responses: 2,302 usable, qualified responses were received. Responses were rejected if the respondent did not identify himself or herself as a life scientist with a permanent position in an academic, hospital, government, or research organization; if the respondent's institution was not identified or identifiable; if the response was a duplicate based on e-mail address and other criteria; or...
Analysis: Respondents assessed their working environment against 38 criteria in 8 different areas, each presented as a positive statement with which the respondent was asked to agree or disagree. Answers were scored on a 1 - 5 scale, with 5 = "Strongly agree", 3 = "Neither agree nor disagree", and 1 = "Strongly disagree". Respondents were also asked to rank how important each factor was to them.
Identification of Institutions: As far as possible, institutions were identified and their names standardized.
Scoring: Scores for each statement were averaged by institution, country, and institution type.
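The per-institution averaging step can be sketched as follows. This is an illustrative reconstruction, not the publisher's actual code; the response record layout (an `"institution"` field plus a `"scores"` mapping of statement to 1-5 score) is an assumption.

```python
from collections import defaultdict

def average_by_group(responses, key):
    """Average each statement's 1-5 score within groups.

    `responses` is a list of dicts such as
    {"institution": ..., "scores": {statement: score}}; passing
    key="institution" (or "country", etc.) groups accordingly.
    Field names are illustrative, not taken from the survey itself.
    """
    sums = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(lambda: defaultdict(int))
    for r in responses:
        group = r[key]
        for statement, score in r["scores"].items():
            sums[group][statement] += score
            counts[group][statement] += 1
    # Mean score per statement, per group
    return {
        g: {s: sums[g][s] / counts[g][s] for s in sums[g]}
        for g in sums
    }
```

The same helper, called once per grouping key, would produce the institution, country, and institution-type averages described above.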
Factor Analysis: Based on the importance scores given to each factor, we calculated an average importance score for each factor and for each group of factors.
Thresholds: US organizations that received fewer than 5 responses were omitted from the rankings. Non-US organizations with fewer than 4 responses were eliminated. We ranked 119 institutions - 89 from the US and 30 from the rest of the world.
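A minimal sketch of the threshold rule, assuming a simple count-per-institution mapping and a flag for US versus non-US (both data structures are illustrative):

```python
def apply_thresholds(counts_by_institution, is_us):
    """Drop institutions below the response threshold:
    at least 5 responses for US institutions, at least 4 elsewhere.

    `counts_by_institution` maps name -> response count;
    `is_us` maps name -> bool. Both are assumed shapes.
    """
    return {
        name: n
        for name, n in counts_by_institution.items()
        if n >= (5 if is_us[name] else 4)
    }
```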
Ranking: In order to calculate the overall rankings of institutions, we first weighted each factor based on the importance ranking. Because several factors that ranked as important in the United States are valued less elsewhere and vice versa, we used different factor weightings to rank US and non-US institutions. The overall rankings were based on the average score per institution on all factors weighted as described.
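The weighted ranking described above might be computed along these lines. The weighting scheme shown (a weighted mean of per-factor averages, with separate weight tables for the US and non-US pools) is an assumption; the article does not specify the exact formula.

```python
def overall_ranking(avg_scores, weights):
    """Rank institutions by an importance-weighted average of factor scores.

    `avg_scores` maps institution -> {factor: mean 1-5 score};
    `weights` maps factor -> importance weight. In practice one weight
    table would be used for US institutions and another for non-US ones.
    All names and shapes here are illustrative.
    """
    def weighted_mean(scores):
        total_weight = sum(weights[f] for f in scores)
        return sum(scores[f] * weights[f] for f in scores) / total_weight

    # Highest weighted average ranks first
    return sorted(avg_scores,
                  key=lambda inst: weighted_mean(avg_scores[inst]),
                  reverse=True)
```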
Institutions were also ranked based on all factors, unweighted.
In addition, we ranked institutions based on unweighted average scores for the 8 categories of statements included in the survey. These categories are:
- Job Satisfaction
- Infrastructure and Environment
- Research Resources
- Management and Policies
- Teaching and Mentoring
- Tenure and Promotion
Results are published in the July 2010 issue of The Scientist and are available on The Scientist web site.
Caveats:
- The sample of respondents, while large, was self-selected, which may introduce some bias into the results.
- The scoring of results is not standardized, and standards may vary among individuals, institutions, and countries.
- In some cases, small sample responses may have led to bias in the results.
- No attempt has been made to measure the statistical significance of the results. The difference between, say, a 10th-ranked and a 20th-ranked institution may be insignificant.
The survey was developed and the responses analyzed by AMG Science Publishing (www.amgpublishing.com).