Survey Methodology


A web-based survey form was posted from October 1 to December 3, 2007. Results were collected and collated automatically.



E-mail invitations were sent to readers of The Scientist and registrants on The Scientist web site who identified themselves as non-tenured life scientists working in academia or other non-commercial research institutions. The survey was also publicized on The Scientist web site and through news stories.


A total of 3,086 usable and qualified responses were received. Responses were rejected if the respondent did not identify himself or herself as a non-tenured scientist working in a non-commercial organization, or if the respondent's institution...


Respondents were asked to assess their working environment against 44 criteria in 11 areas, each presented as a positive statement with which they were asked to agree or disagree.

Answers were scored on a 1-to-5 scale, with 5 = "Strongly agree", 3 = "Neither agree nor disagree", and 1 = "Strongly disagree". Respondents were also asked to rate the importance of each factor to them on a 0-to-3 scale, or to mark a factor as "Not relevant".


Institutions were identified and their names standardized as far as possible. Responses from institutions with branches or campuses in multiple locations were combined if the campuses were in the same state, but treated separately if they were in different states or countries.


The rankings include the 82 U.S. institutions and 17 international institutions that each received five or more responses.


Scores for each statement were averaged by institution and country.


To calculate the overall institutional rankings, we first weighted each factor by its average importance score. Because several factors rated as important in the U.S. are rated as less important internationally, and vice versa, we used different factor weightings for the U.S. and international rankings. The overall rankings were based on each institution's average score across all factors, weighted as described.
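The weighting step described above can be sketched as follows. This is an illustrative reconstruction only: the function name, data layout, and example numbers are assumptions, not the survey's actual code or data.

```python
# Illustrative sketch of importance-weighted scoring (assumed data layout,
# not the survey's actual analysis code). An institution's overall score is
# the weighted average of its per-factor mean agreement scores, with each
# factor weighted by its average importance rating.

def weighted_overall_score(factor_means, importance_weights):
    """factor_means: {factor: mean 1-5 agreement score for one institution}
    importance_weights: {factor: average 0-3 importance score for the pool}
    """
    total_weight = sum(importance_weights[f] for f in factor_means)
    if total_weight == 0:
        return 0.0
    weighted_sum = sum(factor_means[f] * importance_weights[f]
                       for f in factor_means)
    return weighted_sum / total_weight

# Hypothetical example: two factors, one rated far more important.
means = {"funding": 4.0, "benefits": 3.0}
weights = {"funding": 3.0, "benefits": 1.0}
print(weighted_overall_score(means, weights))  # 3.75
```

Using separate `importance_weights` tables for the U.S. and international respondent pools reproduces the different weightings described above.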

Institutions were also ranked based on all factors, unweighted. In addition, we ranked institutions based on unweighted average scores for the 11 major topics covered by the statements included in the survey. These categories are:

  1. Quality of Training and Mentoring
  2. Career Development Opportunities
  3. Quality of Communication
  4. Networking Opportunities
  5. Value of the Postdoc Experience
  6. Quality of Facilities and Infrastructure
  7. Funding
  8. Equity
  9. Remuneration and Compensation
  10. Benefits
  11. Family and Personal Life


Results are published in the March 2008 issue of The Scientist and are available on The Scientist web site.


Several caveats apply to the results:

  • The sample of respondents, while large, was self-selected, which may introduce some bias into the results.
  • The scoring of results is not standardized, and standards may vary between individuals, institutions, and countries.
  • In some cases, small sample sizes may have biased the results.
  • No attempt has been made to measure the statistical significance of the results; the difference between, say, a 10th-ranked and a 15th-ranked institution may be insignificant.


Survey development and data analysis were carried out by AMG Science Publishing.
