
Planetary Sciences

Overview: 

The Department of Planetary Sciences/Lunar and Planetary Laboratory offers multidisciplinary programs leading to the Doctor of Philosophy degree with a major in planetary sciences. Students graduating with a Ph.D. in Planetary Sciences from the University of Arizona should have a broad but quantitative understanding of the physical, geological, and chemical processes affecting the Sun, planets, moons, and other objects within the Solar System, as well as similar processes occurring at planets, debris disks, and other objects orbiting other stars.

Students should have applied these skills to produce a body of work for a Ph.D. dissertation that is of publication quality. Furthermore, they should have acquired the necessary skills to give presentations at national and international science conferences and to teach college- or graduate-level courses and workshops. In general, graduates should be prepared to embark on careers as planetary scientists in academia, research institutes, or government research centers. Information about the academic requirements is available at lpl.arizona.edu/graduate/forms.php.

Expected Learning Outcomes: 

Upon completion of the Ph.D. program in Planetary Sciences, a student will:

1a. Demonstrate a broad, quantitative understanding of the fundamental physical, chemical, geological, and biological processes related to planetary sciences, broadly defined. Planetary sciences includes, but is not limited to, the science of the Sun, planets and their moons, other Solar System objects, and objects orbiting other stars.

1b. Demonstrate an in-depth understanding of a subfield related to planetary sciences (minor).

2. Demonstrate the ability to design, conduct, and document an independent research project that generates results that can be published in the peer-reviewed literature.

3.  Evaluate the scientific literature essential to the student’s research area and articulate how the student’s research is related to and advances the discipline.

4a. Effectively communicate their research to peers and colleagues (e.g., peer-reviewed literature, national or international conferences and workshops, and seminars).

4b. Effectively communicate their science to a broader audience, including its basic aspects to a layperson.

In addition to the main learning outcomes listed above, graduate students in the Department of Planetary Sciences will, in most cases, also:

1. Gain sufficient experience to teach a university-level course in their area of expertise.

2. Submit and publish papers in the scientific literature.

3. Attend scientific conferences, workshops, and seminars to present their research.

4. Receive external fellowships, research grants, or other funding.

5. Establish or participate in intra- and/or extramural collaborations.

6. Successfully mentor junior colleagues and/or research technicians.

7. Successfully compete for a job requiring the skills of a professional scientist.

We note that it is certainly possible for a student to receive a PhD without achieving one or more of these complementary program goals. For any given goal, however, most students who receive a PhD in our program will have achieved it.

Assessment Activities: 

The department has several quantitative ways of assessing whether our students are achieving our expected learning outcomes and complementary program goals; the specific activities are described in detail below. In addition to these, our graduates' ultimate career success is arguably the most important measure of assessment. For current students, winning national and international fellowships and awards is one such measure; indeed, this is one of our complementary program goals. Upon leaving the department, students complete an exit form detailing information about their upcoming job and employer. The department maintains an alumni mailing list and tracks as many alumni as possible; alumni and their current positions are listed on the department homepage. Leadership roles in, and awards from, major international planetary sciences organizations, most notably the Division for Planetary Sciences of the American Astronomical Society, are also tracked. The department additionally maintains student completion rates and demographic information as a measure of success and a tool for improvement.

Beginning in 2013, the department has conducted exit interviews with students leaving the program (with or without a degree). There is both an online exit interview, discussed further below, and an in-person interview with the Assistant Department Head to review the student's career and experiences in our program and to solicit feedback on possible changes that would benefit future students.

The two tables below map our points of measurement of student progress (the first column in each table) to our expected outcomes (the first row in each table). The tables separate primary learning outcomes (first table) from complementary program goals (second table). The assessment activities are split into direct assessments, which are based on quantitative judgments from our faculty at various assessment points (committee meetings, comprehensive examinations, etc., as described in detail below), and indirect assessments, which are self-assessments by the students themselves in the form of exit interviews and alumni surveys.

Primary Learning Outcomes

| Direct Assessments | 1a/b | 2 | 3 | 4a/b |
| --- | --- | --- | --- | --- |
| Graduate Student Colloquia | X | X | X | X |
| Candidacy Examination (Written) | X |  |  |  |
| Candidacy Examination (Oral) | X | X | X | X |
| Dissertation Progress Report | X | X | X | X |
| PhD Dissertation (including defense) | X | X | X | X |
| Indirect or Self-Assessments |  |  |  |  |
| Online exit survey | X | X | X | X |
| Alumni survey | X | X | X | X |

Complementary Program Goals

| Direct Assessments | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| GTA Supervisor Performance Evaluation | X |  |  |  |  |  |  |
| GTA student evaluations (VGE/T) | X |  |  |  |  |  |  |
| Graduate Student Colloquium | X |  |  |  |  |  |  |
| Candidacy Examination (Written) |  |  |  |  |  |  |  |
| Candidacy Examination (Oral) | X | X(a) | X(a) | X(a) | X(a) | X | X |
| Dissertation Progress Report | X | X(a) | X(a) | X(a) | X(a) | X | X |
| PhD Dissertation (including defense) | X | X(a) | X(a) | X(a) | X(a) | X | X |
| Indirect or Self-assessments |  |  |  |  |  |  |  |
| Online exit survey | X | X | X | X | X | X | X |
| Alumni survey | X | X | X | X | X | X | X |

(a) Students are asked to provide specific information pertaining to this goal at the exam/defense/DPR meeting.

Description of LPL/PTYS points of assessment and relation to learning outcomes/goals:

Graduate Student Colloquia (GSC): These are formal presentations given by students within their first two years of the program. The presentations are given to a general audience consisting primarily of fellow graduate students, faculty, postdoctoral research associates, and other research staff at LPL/PTYS, but are also open to the public. Each presentation is typically about 20 minutes. The department requires each student to complete three GSCs: at least one must be a journal-club-style presentation in which the student presents work published in the scientific literature and provides a critical analysis of its methods, impact and relevance, and conclusions; and at least one must be a report on the student's own research. The presentations are evaluated by the audience, who are given forms to fill out at the end of each presentation; a sample GSC evaluation form is appended to this report. The form evaluates, among other things, the student's speaking style, ability to convey the relevant background and importance of the topic, general quality of the presentation, and handling of questions from the audience. This assessment activity provides important information on all four of the department's expected learning outcomes listed above, as well as the first of our complementary goals, since being able to explain a presentation to a broad audience is an important quality of a successful teacher. The results of the audience evaluations are returned to the student as feedback and are also archived as part of the department's assessment activities.

Candidacy Exam (Written): The written portion of the candidacy exam tests our students' breadth of knowledge within planetary sciences. Students gain the necessary knowledge by taking the graduate core courses required by the department: eight courses in total, four related to planetary physics and two each for planetary chemistry and planetary geology. Students are required to take a minimum of four core courses, of which two must be physics related, one chemistry related, and one geology related. The written exam consists of three parts, paralleling the three sub-groups of core courses. All three parts are taken in a single day: three hours for the physics part and three hours combined for the chemistry/geology parts. Two questions are posed for each of the eight core courses, so the complete exam has 16 questions. Students must answer at least half of the questions in each part, corresponding to four physics questions, two chemistry questions, and two geology questions. The exam is administered by the departmental prelim exam committee, which also decides pass/fail. The written portion takes place at the end of the student's second year in our program, although well-prepared students can take it sooner if they choose. The prelim exam committee tallies and registers each student's scores for each portion of the exam. In addition to determining whether each student passes or fails, the results are archived to assess our students' general breadth of knowledge in planetary sciences (learning outcome #1).

Candidacy Exam (Oral): This portion of the candidacy exam is aimed primarily at judging the student's ability to conduct research leading to a PhD dissertation. The student is required to open the exam with a presentation of a science project that they intend to constitute a major portion of their PhD thesis; they must also have a second project ready to discuss if necessary. To take the exam, the student must prepare a brief written report on these two projects, which the student's PhD committee reads beforehand. After the brief presentation of the science project, a standard oral exam takes place in which the student's understanding of the project is tested. To be successful, the student must demonstrate both detailed, specific knowledge of the project and an understanding of its broader, more fundamental aspects. This latter part overlaps somewhat with the written exam and requires the student to also have a broad understanding of planetary sciences in general. The oral-exam committee decides whether the student passes or fails and provides feedback as needed. The committee fills out a form, used for our program's assessment activities, that includes questions relating directly to our learning outcomes and goals. The student is also expected to submit a form with responses to a specific set of questions, such as whether they have published any journal articles (and how many) and whether they have obtained external grant support. Note that this is not a self-assessment; it is simply a list of responses to specific factual questions that the student is best positioned to answer. These forms, which are also used for the Dissertation Progress Report and PhD defense (discussed below), are appended to this report. The results are not used to grade the student's performance in the exam; they are used strictly for our assessment activities and are archived.

As indicated in the tables above, the oral exam provides important information relating to all of our learning outcomes and goals.

Dissertation Progress Report (DPR): This is a meeting between the student and his/her PhD dissertation committee, required once per year for the first three years after the oral candidacy exam and every six months thereafter. The student makes a brief oral presentation to the committee describing progress toward the PhD. The committee is required to indicate whether or not the student is making progress, and each committee member separately fills out a form evaluating the student's progress toward achieving our department's learning outcomes and goals. The student also submits responses to specific questions. These forms are the same as those used for the oral candidacy exam discussed above. The results are archived.

PhD dissertation (including defense): Beyond the obvious necessity of writing and defending a quality PhD dissertation in order to obtain a PhD in LPL/PTYS, the dissertation also provides an important assessment point for our program. Each student's PhD committee reads the thesis to judge its merits. In addition to deciding whether a PhD is warranted, the committee also provides important assessment information for our program: each member of the committee, and the student, fills out the same form used for the oral candidacy exam and DPR, whose questions map directly to all of our expected learning outcomes and goals. As with the other assessment points, this information is tallied for each student and archived for our assessment analysis.

GTA Supervisor Performance Evaluation: Graduate students in PTYS are required to serve as a Graduate Teaching Assistant (GTA) for a minimum of two classes taught at the University of Arizona. These are nearly always undergraduate classes, and nearly always those we offer as part of our contribution to the General Education program[1]. This gives our graduate students direct exposure to University undergraduate classes and an excellent opportunity to learn how to teach a university-level course. It also provides the department with a means of assessing this goal, since the instructor of record for these classes, either a tenure-track member of our faculty or an adjunct instructor, is asked to evaluate each GTA for his/her class. The evaluation form is online, and a screenshot of it is appended to the back of this report. This assessment activity provides important information for our first complementary program goal.

GTA Student Evaluations (VGE/T): In addition to the instructor's evaluation, the (usually undergraduate) students enrolled in the class also evaluate each GTA. This is primarily intended to provide feedback to the GTA, since it includes written comments, but it also provides quantitative information with which to assess the first of our complementary program goals. Each student enrolled in the course provides a numeric evaluation of the GTA ranging from Poor to Excellent. We have found that a particularly useful quantitative measure is the ratio of the number of Very Good and Excellent responses to the total number of responses, known as the VGE/T score. This score is tabulated and archived for each GTA and each class, and used as part of our assessment activities.
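As a sketch of how a VGE/T score of this kind is computed (the response list below is a hypothetical example, not actual evaluation data):

```python
from collections import Counter

def vge_t_score(responses):
    """Ratio of 'Very Good' plus 'Excellent' responses to all responses (VGE/T)."""
    counts = Counter(responses)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return (counts["Very Good"] + counts["Excellent"]) / total

# Hypothetical evaluation responses for one GTA in one class:
responses = ["Excellent", "Very Good", "Good", "Very Good", "Fair",
             "Excellent", "Good", "Very Good", "Excellent", "Poor"]
print(f"VGE/T = {vge_t_score(responses):.2f}")  # 6 of 10 responses -> 0.60
```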

Online exit survey: When each of our graduate students leaves the program, with or without a degree, they are asked to fill out an online exit survey appropriate to their status: PhD (lpl.arizona.edu/graduate/exit-1), MSc (lpl.arizona.edu/graduate/exit-2), or no degree (lpl.arizona.edu/graduate/exit-3). The results of this survey provide important quantitative information that relates directly to all of our program's learning outcomes and goals.

Alumni survey: We also periodically survey our alumni, which provides a means of assessing how well our program benefited them. In 2008, the department distributed a survey to graduate alumni asking about their experiences at LPL/PTYS. The questions covered their thoughts on the number, content, and usefulness of the (now previous) core-course policy and electives, and solicited suggestions for improvement. The survey also asked about their teaching experiences at LPL and how useful those were for their future endeavors. Lastly, the survey asked for their views on the length of LPL PhDs. This information played a large part in the discussion that eventually led to our current core-course policy, as described in our most recent annual program review. When we next perform an alumni survey, we will include questions that directly relate to our expected learning outcomes and complementary program goals.

In addition to the specific assessment activities listed above, we also keep close tabs on each graduate student's general progress toward the PhD. At the end of each semester, students and advisors receive a Student Academic Progress Report detailing academic progress, including grades. The Graduate Admissions and Advising Committee (GAAC) meets with students annually to assess progress in coursework.


[1] PTYS offers both Tier 1 and Tier 2 NATS classes, historically with a frequency of at least 4-5 classes per semester, providing ample opportunities for our graduate students to be GTAs. However, see the discussion under “Changes”.

 

Assessment Findings: 

Below is a list of quantitative results from the 2014/15 through 2016/17 academic years[1], obtained from the assessments discussed in the previous section. The first subsection summarizes the quantitative results, listing only the averages over all assessment activities. The second subsection gives the results separately for each programmatic goal, including the results from each specific assessment activity. The third subsection lists some general results pointing to the success of our program.

Summary Tables of Quantitative Results:

The scores given below are based on all assessments, averaged together; a score of 100% means that the given outcome was judged to be met perfectly. The uncertainties in these numbers are of order 15-30%, depending on the specific goal. The next section gives a more detailed quantitative analysis, showing the results from the separate assessments and including estimates of the uncertainties. We note that these numbers are (mostly) from the 2014/15 through 2016/17 academic years, and since we have only ~30 graduate students in total, the number of assessments is not large. We are therefore dealing with small-number statistics and correspondingly significant uncertainties. Continued assessments in future years will reduce these uncertainties.

 

Primary Learning Outcomes

|  | Goal 1a | Goal 1b | Goal 2 | Goal 3 | Goal 4a | Goal 4b |
| --- | --- | --- | --- | --- | --- | --- |
| AVG. from all assessments | 79% | 80% | 84% | 80% | 81% | 80% |

Complementary Program Goals

|  | Goal 1 | Goal 2 | Goal 3 | Goal 4 | Goal 5 | Goal 6 | Goal 7 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| AVG. from all assessments | 81% | 61% | 100% | 76% | 84% | 88% | 75% |

Detailed List of Quantitative Results

Goal #1a: Demonstrate a broad, quantitative understanding of the fundamental physical, chemical, geological, and biological processes related to planetary sciences, broadly defined. Planetary sciences includes, but is not limited to, the science of the Sun, planets and their moons, other Solar System objects, and objects orbiting other stars.

| Assessment Activity | Average | Number of Responses | Normalized Score(f) | Unc.(g) |
| --- | --- | --- | --- | --- |
| Graduate Student Colloquia | 0.77(b) | 75 | 77% | 12% |
| Candidacy Exam (Written) | 78%(c) | 19 | 78% | 19%(h) |
| Oral Exams(a) | 4.09(d) | 67 | 82% | 13% |
| Exit Interviews (online) | 2.5(e) | 12 | 83% | 29% |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 79% |  |

(a) Includes Candidacy Exam (Oral), Dissertation Progress Reports, and Final PhD defense.

(b) This is the ratio of the number of “Great” and “Good, …” responses divided by the total number for the “Context of the Talk” category (the most relevant to this particular goal) of all GSC surveys discussed in the previous section from Fall 2011 through Spring 2016.

(c) This is the average of the total written exam final scores for exams taking place in 2013 through 2016 (scores for the 2017 exam are not yet available).  The written exam has components related to Physics, Geology, and Chemistry, which are graded separately to produce a total final score, with a maximum of 100%.  A grade of 60% or above is considered passing.

(d) This is the average of the scores on the oral-exam committee surveys for this goal. The maximum is 5 and the minimum is 0. "n/a" responses on the surveys are not counted in the average or in the number of responses.

(e) This is the average of the responses to the students' self-assessment of their level of skill for this programmatic goal, for all students completing the online survey upon exiting our program in the 2014/15 through 2016/17 academic years. A score of 3 is "expert" and a score of 0 is "no ability".

(f) This is normalized to a score out of 100, with 100% as the maximum.

(g) Uncertainty based on Poisson statistics, taking the relative uncertainty as (√N)/N, where N is the number of responses.

(h) This is half the range between the maximum and minimum values.
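The normalization and uncertainty conventions in notes (f) and (g) can be sketched as follows (a minimal illustration, assuming the simple sqrt(N)/N relative-uncertainty rule stated above; the example values are taken from the Goal #1a table):

```python
import math

def normalized_score(avg, max_score):
    """Normalize a raw average to a percentage of the maximum possible score."""
    return 100.0 * avg / max_score

def poisson_uncertainty(n_responses):
    """Relative uncertainty (percent) based on Poisson statistics: sqrt(N)/N."""
    return 100.0 * math.sqrt(n_responses) / n_responses

# Examples drawn from the Goal #1a table above:
print(round(normalized_score(0.77, 1.0)))  # GSC ratio -> 77 (%)
print(round(normalized_score(4.09, 5.0)))  # oral-exam score out of 5 -> 82 (%)
print(round(poisson_uncertainty(75)))      # N = 75 responses -> 12 (%)
print(round(poisson_uncertainty(12)))      # N = 12 responses -> 29 (%)
```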

 

Goal #1b:  Demonstrate an in-depth understanding of a subfield related to planetary sciences (minor).

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Graduate Student Colloquia | 0.77(a) | 75 | 77% | 12% |
| Candidacy Exam (Written) | 78%(c) | 19 | 78% | 19%(c) |
| Oral Exams | 4.22(b) | 67 | 84% | 12% |
| Exit Interviews (online) | 2.5(a) | 12 | 83% | 29% |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 80% |  |

 

(a) Same as for Goal #1a.

(b) This is the average of the scores on the oral-exam committee surveys for this goal (5 is maximum).

(c) See equivalent note for Goal #1a

Goal #2: Demonstrate the ability to design, conduct, and document an independent research project that generates results that can be published in the peer-reviewed literature.

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Graduate Student Colloquia | 0.85(a) | 75 | 85% | 12% |
| Oral Exams | 4.18(b) | 70 | 84% | 12% |
| Exit Interviews (online) | 2.50(c) | 12 | 83% | 29% |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 84% |  |

 

(a) This is the ratio of the number of "Great" and "Good, …" responses divided by the total number for the "Structure and Organization" category (the most relevant to this particular goal) of all GSC surveys discussed in the previous section, from Fall 2012 through Spring 2017.

(b) This is the average of the scores on the oral-exam committee surveys for this goal (5 is maximum).

(c) This is the average of the responses to the students' self-assessment of their level of skill for this programmatic goal, for all students completing the online survey upon exiting our program in the 2014/15 through 2016/17 academic years. A score of 3 is "expert" and a score of 0 is "no ability".

Goal #3:  Evaluate the scientific literature essential to the student’s research area and articulate how the student’s research is related to and advances the discipline.

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Graduate Student Colloquia | 0.76(a) | 36 | 76% | 15% |
| Oral Exams | 4.15(b) | 70 | 83% | 12% |
| Exit Interviews (online) | -(c) | - | - | - |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 80% |  |

 

(a) This is the ratio of the number of "Great" and "Good, …" responses divided by the total number for the "Critical Analysis (if talk reviews a journal article)" category (the most relevant to this particular goal) of all GSC surveys discussed in the previous section.

(b) This is the average of the scores on the oral-exam committee surveys for this goal (5 is maximum).

(c) The current version of the online exit survey does not ask the students to self-assess this goal.

 

Goal #4a: Effectively communicate their research to peers and colleagues (e.g., peer-reviewed literature, national or international conferences and workshops, and seminars).

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Graduate Student Colloquia | 0.78(a) | 375(c) | 78% | 5% |
| Oral Exams | 3.98(b) | 70 | 80% | 12% |
| Exit Interviews (online) | 2.67 | 9 | 89% | 33% |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 81% |  |

 

(a) This is the ratio of the number of “Great” and “Good, …” responses divided by the total number for the following five categories in the GSC surveys: “Student’s Speaking Style”, “Content and Communication”, “Use of Visual Aids”, “Handling of Questions”, and “Decorum and Speaker Mechanics” (these are most relevant to this particular goal).

(b) This is the average of the scores on the oral-exam committee surveys for this goal (5 is maximum).

(c) There were a total of 75 GSCs from Fall 2012 through Spring 2017, and there are 5 categories on the survey relevant to this goal, giving a total of 75x5 = 375 possible entries.

 

Goal #4b: Effectively communicate their science to a broader audience, including its basic aspects to a layperson.

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Graduate Student Colloquia | 0.79(a) | 225(d) | 79% | 7% |
| Oral Exams | 3.80(b) | 53 | 76% | 14% |
| Exit Interviews (online) | 2.67(c) | 9 | 89% | 33% |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 80% |  |

 

(a) This is the ratio of the number of “Great” and “Good, …” responses divided by the total number for the following three categories in the GSC surveys: “Student’s Speaking Style”, “Handling of Questions”, and “Decorum and Speaker Mechanics” (these are most relevant to this particular goal).

(b) This is the average of the scores on the oral-exam committee surveys for this goal (5 is maximum).

(c) The same as the response for Goal #4a.

(d) There were a total of 75 GSCs between Fall 2012 and Spring 2017, and 3 categories on the survey relevant to this goal, giving a total of 75x3 = 225 possible entries.

Complementary Program Goal #1:  Gain sufficient experience to teach a university-level course in their area of expertise

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| GTA Supervisor Eval. | 4.86(a) | 48 | 97% | 14% |
| GTA Student Eval. | 0.78(b) | 40 | 78% | 16% |
| Grad Student Colloquia | 0.78(c) | 375 | 78% | 5% |
| Oral Exams | 3.94 | 49 | 79% | 14% |
| Exit Interviews (online) | 2.2(d) | 10 | 73% | 32% |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 81% |  |

 

(a) This is the average of the "Overall Evaluation" scores for each GTA during the 2014-15 through 2016-17 academic years, as scored by the instructor of the course (5 is maximum).

(b) This is the ratio of the number of “Very Good” + “Excellent” to the total number of responses based on the GTA evaluations provided by the students in the class, for all GTAs during the 2014-15 through 2016-17 academic years.

(c) This is the same as that for Goal #4a since being able to communicate is a key aspect of being able to teach a University level course.

(d) This is the average of the responses to the students' self-assessment of their level of skill for this programmatic goal, for all students completing the online survey upon exiting our program in the 2014/15 through 2016/17 academic years. A score of 3 is "expert" and a score of 0 is "no ability".

Complementary Program Goal #2:  Submit/publish papers to the scientific literature

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Oral Exam (Candidacy + DPR) | 55%(a) | 58 | 55% | 13% |
| Oral Exam (PhD defense) | 93%(b) | 12 | 93% | 29% |
| Exit Interviews (online) | -(c) | - | - | - |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 61%(d) |  |

 

(a) This is the ratio of the number of "Yes" responses to the total number of responses for students taking either the oral candidacy exam or a dissertation progress report.

(b) This is the ratio of the number of "Yes" responses to the total number of responses for students taking the final PhD defense.

(c) The current version of the online survey does not ask the student for this information.

(d) The ratio of the total number of "Yes" responses to the total number of responses for students taking all oral exams (candidacy, DPR, and final PhD defense).

Complementary Program Goal #3: Attend scientific conferences, workshops, and seminars to present their research

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Oral Exam (Candidacy + DPR) | 100%(a) | 70 | 100% | 13% |
| Oral Exam (PhD defense) | 100%(b) | 12 | 100% | 29% |
| Exit Interviews (online) | -(c) | - | - | - |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 100%(d) |  |

 

(a) This is the ratio of the number of "Yes" responses to the total number of responses for students taking either the oral candidacy exam or a dissertation progress report.

(b) This is the ratio of the number of "Yes" responses to the total number of responses for students taking the final PhD defense.

(c) The current version of the online survey does not ask the student for this information.

(d) The ratio of the total number of "Yes" responses to the total number of responses for students taking all oral exams (candidacy, DPR, and final PhD defense).

Complementary Program Goal #4:  Receive external fellowships/research grants/funding

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Oral Exam (Candidacy + DPR) | 74%(a) | 58 | 74% | 13% |
| Oral Exam (PhD defense) | 83%(b) | 12 | 83% | 29% |
| Exit Interviews (online) | -(c) | - | - | - |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 76%(d) |  |

 

(a) This is the ratio of the number of "Yes" responses to the total number of responses for students taking either the oral candidacy exam or a dissertation progress report.

(b) This is the ratio of the number of "Yes" responses to the total number of responses for students taking the final PhD defense.

(c) The current version of the online survey does not ask the student for this information.

(d) The ratio of the total number of "Yes" responses to the total number of responses for students taking all oral exams (candidacy, DPR, and final PhD defense).

Complementary Program Goal #5:  Establish or participate in intra- and/or extra-mural collaborations

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Oral Exam (Candidacy + DPR) | 81%(a) | 58 | 81% | 13% |
| Oral Exam (PhD defense) | 100%(b) | 12 | 100% | 29% |
| Exit Interviews (online) | -(c) | - | - | - |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 84%(d) |  |

 

(a) This is the ratio of the number of "Yes" responses to the total number of responses for students taking either the oral candidacy exam or a dissertation progress report.

(b) This is the ratio of the number of "Yes" responses to the total number of responses for students taking the final PhD defense.

(c) The current version of the online survey does not ask the student for this information.

(d) The ratio of the total number of "Yes" responses to the total number of responses for students taking all oral exams (candidacy, DPR, and final PhD defense).

Complementary Program Goal #6:  Successfully mentor junior colleagues and/or research technicians

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Oral Exam(a) | 4.41(b) | 11 | 88% | 30% |
| Exit Interviews (online) | -(c) | - | - | - |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 88% |  |

          

(a) Includes all oral exams: candidacy, DPR, and final PhD defense.

(b) This is a score out of 5 maximum possible points, based on responses by members of the exam committee.

(c) The current version of the online survey does not ask the student for this information.

Complementary Program Goal #7:  Successfully compete for a job requiring the skills of a professional scientist

| Assessment Activity | Average | Number of Responses | Normalized Score | Unc. |
| --- | --- | --- | --- | --- |
| Oral Exam(a) | 4.14(b) | 62 | 83% | 13% |
| Exit Interviews (online) | 2.0(c) | 10 | 67% | 32% |
| Alumni Survey | - | - | - | - |
| AVERAGE |  |  | 75% |  |

 

(a) Includes all oral exams: candidacy, DPR, and final PhD defense.

(b) This is a score out of a maximum possible 5 points, based on responses by members of the exam committee.

(c) Average of the responses given on the online exit survey for students exiting our program during the 2014/15 through 2016/17 academic years (3 = expert, 0 = no ability).
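As a sanity check on the arithmetic in these tables: the normalized scores are the averages rescaled to their maximum possible value, and the quoted uncertainties are consistent with a simple 1/√N counting estimate (our assumption here; the report does not state how “Unc.” is computed). A minimal Python sketch using the Goal #7 numbers:

```python
from math import sqrt

def normalized(score, max_score):
    """Rescale an average score to a percentage of its maximum."""
    return 100 * score / max_score

def uncertainty(n_responses):
    """Counting-statistics estimate, 1/sqrt(N), as a percentage.
    (An assumption: the report does not state how "Unc." is computed.)"""
    return 100 / sqrt(n_responses)

oral = normalized(4.14, 5)          # 82.8 -> quoted as 83%
exit_survey = normalized(2.0, 3)    # 66.7 -> quoted as 67%
average = (oral + exit_survey) / 2  # 74.7 -> quoted as 75%

print(round(oral), round(exit_survey), round(average))  # 83 67 75
print(round(uncertainty(62)), round(uncertainty(10)))   # 13 32
```

The same 1/√N estimate reproduces the 13%, 29%, and 30% uncertainties quoted for the oral-exam rows of the earlier tables (N = 58, 12, and 11, respectively).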


[1] Except for the written candidacy exam and the GSC, for which we have data going back to 2012/13.

Other Findings

We list some other useful information pertaining to our assessment activities. From 2012 to the present (June 2017), all students achieved a 3.0 grade point average in their major and minor coursework, and no student has left the program during this period because of failing the comprehensive examination, although some chose to take a Master's degree rather than sit the candidacy exam. No student has failed the final Ph.D. examination in that period.

From mid-2012 to the present (June 2017), LPL students were first authors on 284 papers and conference abstracts (counting only work published while the individual was enrolled as a student). On average, then, LPL students are lead authors on 56.8 publications per year. With our average student population of 34 over this period, this rate corresponds to 1.67 first-author publications per student per year. At that rate, a typical student (here for 6 years) would graduate with 10 first-author publications (papers and abstracts), although most of these are concentrated in the latter half of a student's tenure. In addition, our students are typically co-authors of many other papers, written in collaboration with faculty, other students, and scientists from other institutions.
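The publication-rate arithmetic above is easy to reproduce; the 5-year window (mid-2012 through June 2017) and the 6-year tenure are taken directly from the text:

```python
papers = 284     # first-author papers and abstracts, mid-2012 through June 2017
years = 5.0      # span of that window
students = 34    # average student population over the period
tenure = 6       # typical years in the program

per_year = papers / years             # 56.8 publications per year
per_student = per_year / students     # ~1.67 per student per year
at_graduation = per_student * tenure  # ~10 first-author publications

print(per_year, round(per_student, 2), round(at_graduation))  # 56.8 1.67 10
```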

LPL students are nationally competitive and rewarded for their research both before and after graduation. Before completion, approximately 30% of our graduate students hold outside scholarships or fellowships, among other prizes and awards, at any given time. For 2016/2017, 7 (of 32) students held NASA Earth and Space Science Fellowships (NESSFs) and 2 held NSF Graduate Research Fellowships. LPL students have also been recipients of NASA Jenkins Fellowships, Sloan Fellowships, Berkner Internships, and Canadian Natural Sciences and Engineering Research Council (NSERC) Fellowships. LPL graduate students have also been successful in earning College of Science prizes. Since 2012, of the 18 college-wide outstanding graduate student awards (one each year for scholarship, teaching, and service), LPL students have won four, competing each year against more than a dozen other departments, many with larger graduate programs.

After completing the program, LPL students have been successful in their professional communities. For example, of the ~45 leadership roles in the American Astronomical Society's Division for Planetary Sciences (DPS), 8-12 were filled by LPL graduates at any given time from 2012 to 2017 (several other roles are typically held by LPL faculty and staff). Since 2010, LPL graduates have been awarded three of the four DPS prizes for scientists (best young scientist, outstanding service, and outstanding communications), the outstanding career award of the American Geophysical Union Planetary Sciences Section, and two of the Meteoritical Society's four awards (best young scientist and outstanding contributions in impact cratering).

College of Science Outstanding Graduate Student awards to LPL students, 2010-2017

| Year | Award            | Winner       |
|------|------------------|--------------|
| 2012 | Scholarship      | Nikole Lewis |
| 2013 | Scholarship      | Kathryn Volk |
| 2014 | Teaching         | Ali Bramson  |
| 2015 | Service/Outreach | Jamie Molaro |
External awards for LPL alumni, 2010-2016

Year

Organization

Award

Winner

2010

Meteoritical Society

Barringer (impact cratering)

William Hartmann

2010

AAS DPS

Masursky (service)

Mark Sykes

2011

Meteoritical Society

Nier (young scientist)

Fred Ciesla

2011

AAS DPS

Urey (young scientist)

Jonathan Fortney

2014

AAS DPS

Sagan (communications)

Guy Consolmagno

2016

AGU Planetary Sciences Section

Whipple (outstanding career)

John Spencer

 

Change in Response to Findings: 

We began (mostly) collecting assessment data during the 2014/15 academic year; because of the rather small number of total assessments during this period, the uncertainties in our quantitative assessment numbers are significant. An initial examination of the findings suggests that we are meeting our goals quite well. As such, the faculty feels that no major changes are needed at this time, but we continue to discuss the findings as more assessment data are collected. Minor changes, and near-term assessment plans, include:

  • Although all students who have taken the written candidacy examination have passed the exam as a whole, some have not passed in each of the subject areas, raising concerns about how we are meeting Goal #1a. In response, we have changed our policies: if a student passes the exam as a whole but does not achieve a passing score on one of the portions (Chemistry, Physics, or Geosciences), the student will be required to take at least one elective in the general subject area in which they demonstrated weakness.
  • One potential change that we are monitoring concerns our teaching requirements. We currently require students to act as Teaching Assistants for two semesters, but the exit interview responses we have so far are generally negative about the experience. While the students judged that they would be able to teach at the college level (Complementary Program Goal #1, average self-assessment 2.2 of 3), the “teaching experience” was rated only slightly higher in relevance than the core courses and the electives, and several students commented that teaching experience other than that acquired as a GTA (e.g., in outreach activities) was more valuable. We will include a question on our upcoming alumni survey, and we are exploring other ways to enhance the teaching experience, ranging from serving as a GTA in upper-level undergraduate classes (the requirement is currently fulfilled by working in General Education courses) to setting requirements on the faculty for the duties to be assigned to GTAs. In addition, RCM incentives have led many other departments to offer General Education courses, so enrollments in our existing courses have decreased, creating an incentive for us to reduce the number of courses offered.
  • We have found that by far the most common response by faculty to the question about Complementary Program Goal #6 (mentoring junior colleagues and technicians) is “n/a”, indicating that this topic does not come up in the venues in which the questions are asked (candidacy exams, dissertation committee meetings, and final defenses), so the faculty filling out those forms have no basis for judgment. Hence we will discontinue use of this question, since very little information is being gathered, and will add it to the online exit interview instead.
  • We are approaching 10 years since our last alumni survey (2008), so we are working toward developing another survey for no later than the 2018-2019 academic year. 

 

Attachments:

  • Assessment Forms (PDF, 712.23 KB)
  • PTYS/LPL Graduate Program Statistics (PDF, 455.83 KB)

Updated date: Sat, 06/03/2017 - 11:38