Survey Methodology

Survey Form: A web-based survey form was posted from September 9 to November 30, 2009. Results were collected and collated automatically.

Invitations: E-mail invitations were sent to readers of The Scientist and registrants on The Scientist web site who identified themselves as non-tenured life scientists working in academia or other non-commercial research institutions. The survey was also publicized on The Scientist web site and through news stories.

Responses: 3,105 usable and qualified responses were received. Responses were rejected if the respondent did not identify himself or herself as a non-tenured scientist working in a non-commercial organization, or if the respondent's...

Analysis: Respondents were asked to assess their working environment against 43 criteria in 11 different areas, each presented as a positive statement with which the respondent was asked to agree or disagree. Answers were scored on a 1–5 scale, where 5 = "Strongly agree", 3 = "Neither agree nor disagree", and 1 = "Strongly disagree". Respondents were also asked to rate the importance of each factor to them on a 0–5 scale, and could instead mark a factor as "Not relevant".
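By way of illustration, this scoring scheme can be sketched in Python. The record layout and field names below are hypothetical and do not reflect the survey's actual data format:

    from statistics import mean

    # Each response records agreement on a 1-5 scale (5 = "Strongly agree",
    # 3 = "Neither agree nor disagree", 1 = "Strongly disagree") and
    # importance on a 0-5 scale; None stands in for "Not relevant".
    responses = [
        {"statement": "Q1", "agreement": 4, "importance": 5},
        {"statement": "Q1", "agreement": 2, "importance": 3},
        {"statement": "Q1", "agreement": None, "importance": None},  # "Not relevant"
    ]

    def average_agreement(records, statement):
        # Mean agreement for one statement, skipping "Not relevant" answers.
        scores = [r["agreement"] for r in records
                  if r["statement"] == statement and r["agreement"] is not None]
        return mean(scores) if scores else None

    print(average_agreement(responses, "Q1"))  # -> 3.0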

Identification of Institutions: As far as possible, institutions were identified and their names standardized. Responses from institutions with branches or campuses in multiple locations were combined if the campuses were in the same state, but treated as separate institutions if they were in different states or countries.
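This grouping rule can be illustrated with a short, hypothetical sketch; the institution names and field names are invented:

    from collections import defaultdict

    responses = [
        {"institution": "State University", "state": "CA", "country": "USA"},
        {"institution": "State University", "state": "CA", "country": "USA"},  # merged with the line above
        {"institution": "State University", "state": "NY", "country": "USA"},  # kept separate
    ]

    # Key on (standardized name, country, state): campuses in the same state
    # merge into one entry; different states or countries stay distinct.
    groups = defaultdict(list)
    for r in responses:
        groups[(r["institution"], r["country"], r["state"])].append(r)

    for key, members in groups.items():
        print(key, len(members))  # (name, country, state) and response count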

Thresholds: Only institutions that received 5 or more responses were included in the rankings; 75 US institutions and 15 non-US institutions met this threshold.

Scoring: Scores for each statement were averaged by institution and country.

Ranking: To calculate the overall rankings of institutions, we first weighted each factor by its average importance score. Because several factors rated as important in the USA are rated as less important outside the USA, and vice versa, we used different factor weightings in our rankings of US and non-US institutions. The overall rankings were based on each institution's average score across all factors, weighted as described.
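As a sketch of this calculation (all factor names, weights, and scores below are invented for illustration):

    def weighted_score(factor_scores, weights):
        # Importance-weighted mean of an institution's per-factor averages.
        total = sum(factor_scores[f] * weights[f] for f in factor_scores)
        return total / sum(weights[f] for f in factor_scores)

    # Average importance per factor among, e.g., US respondents.
    us_weights = {"mentoring": 4.6, "funding": 4.2, "benefits": 3.5}

    institutions = {
        "Institution A": {"mentoring": 4.1, "funding": 3.8, "benefits": 3.9},
        "Institution B": {"mentoring": 3.7, "funding": 4.4, "benefits": 4.0},
    }

    # Rank institutions by their weighted average score, highest first.
    ranked = sorted(institutions.items(),
                    key=lambda kv: weighted_score(kv[1], us_weights),
                    reverse=True)
    for name, scores in ranked:
        print(name, round(weighted_score(scores, us_weights), 3))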

Institutions were also ranked based on all factors, unweighted.

In addition, we ranked institutions based on unweighted average scores for the 11 major topic categories covered by the survey statements; a sketch of the per-category calculation follows the list. These categories are:

    1. Quality of Training and Mentoring
    2. Career Development Opportunities
    3. Quality of Communication
    4. Networking Opportunities
    5. Value of the Postdoc Experience
    6. Quality of Facilities and Infrastructure
    7. Funding
    8. Equity
    9. Remuneration and Compensation
    10. Benefits
    11. Family and Personal Life 
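A minimal sketch of the per-category calculation, assuming a mapping from statements to categories (the mapping and all values below are invented):

    from statistics import mean

    category_statements = {
        "Funding": ["Q17", "Q18"],
        "Benefits": ["Q30", "Q31", "Q32"],
    }

    # One institution's average score per statement (hypothetical values).
    statement_means = {"Q17": 3.9, "Q18": 4.1, "Q30": 3.2, "Q31": 3.6, "Q32": 3.4}

    # A category's score is the unweighted mean of its statements' averages.
    category_scores = {
        cat: mean(statement_means[q] for q in qs)
        for cat, qs in category_statements.items()
    }
    print(category_scores)  # per-category unweighted means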

Results: Results are published in the March 2010 issue of The Scientist and are available on The Scientist web site at www.the-scientist.com.

Caveats:

  • The sample of respondents, while large, was self-selected, which may introduce some bias into the results.
  • The scoring of results is not standardized, and individual standards may vary between respondents, institutions, and countries.
  • In some cases, small sample responses may have led to bias in the results.
  • No attempt has been made to measure the statistical significance of the results. The difference between, say, a 10th-ranked and a 15th-ranked institution may be insignificant.

The survey was developed, and the responses analyzed, by AMG Science Publishing (www.amgpublishing.com).

The questions used in the survey are available on The Scientist web site.
