Part 4: Teaching Policy People to Code at the Harvard Kennedy School
Final Thoughts, Lessons Learned, and Looking to the Future
This summer, digital HKS launched an experimental pilot to help MPP and MPA students at the Harvard Kennedy School of Government (HKS) learn Python.
We started the conversation with a post about why we are doing this, and over the summer we shared more details about this experiment: why we ran it, how we did it, and what we hoped to achieve. In this final post, we want to discuss some lessons we learned from this summer's experiment. We'll provide an overview of the academic outcomes and student reactions to the process, and reflect on some interesting takeaways from all of the work.
About 27% (245) of all incoming (new and returning) public policy and administration students at the Kennedy School expressed some initial interest in joining last summer's Python pilot at the end of the 2017–2018 academic year. Of these, 143 showed up on the first day of 'class' to create an account — a pretty good signal that the anecdotal interest we heard from students reflected real demand. We don't have a great sense of what led to the pre-summer 'melt' in this group, but we heard from a few students that in the rush of getting an internship underway or moving to Cambridge to start school, they simply lost track of the project. For us, it highlights a potential need for more coordinated communication with folks during this busy transition period — a theme we'll return to later.
A big challenge in any online learning environment is attrition — it's hard to keep students engaged without a physical classroom, one-on-one conversations with a professor, and the peer pressures and relationships that emerge among classmates. We installed a few guardrails to help students stay engaged (a Slack channel, a dedicated discussion space in the Canvas LMS, course assistants with live chat windows), but we expected significant attrition nonetheless. From the start of the pilot to the end, the number of students completing each successive task fell; 17% of those who submitted the first assignment stuck it out to write the final exam.
[Figure: Percentage of Students Attempting Successive Course Modules (as a share of those attempting the initial exercise)]
But one *big* caveat for these numbers is that unlike many online courses (which typically have worse attrition rates), we were taking an existing community with diverse interests and giving it access to this resource. A lot of online education platforms are trying to do the opposite (i.e., build a community from scratch through the learning experience), so from the very beginning, we wondered if there might be some better ways to measure the value that our students were getting from this course.
What We Learned
Policy students who joined this summer’s experiment in coding did so for a number of reasons. For many, this was the first time they had ever written a line in Python or any other programming language; for others, it was a quick refresher for skills that had gotten rusty. Understanding the success of such a diverse group is tough — a good experience for the first group might involve very different outcomes than those that would be best for the second group. We had informal conversations with students across the summer about how things were going, but in the spirit of a data science pilot, we also took time at the end of the summer to collect some survey data from participants.
We surveyed students at the end of the summer to learn more about their personal goals for the course. Of the 143 students who originally signed up for the edX course, 40 completed our end-of-summer survey about their experience. Of these, 70% were in the Master in Public Policy (MPP) program, 15% were in the Master in Public Administration (MPA) program, and the remainder came from the Mid-Career Master in Public Administration and joint degree programs.
There was a nearly even split in previous coding experience, with about one-third each falling into 'This was my first coding experience,' 'Less than one year,' and 'One to five years' of background knowledge. However, the distribution of experience with previous programming, analytical, and query languages varied a lot. Stata and R were the most common, with about 50% and 35% of participants respectively having some background (unsurprising to us, as some of the core first-year policy courses at HKS require Stata or R). Most notably, only 10% of participants had ever used Python before taking this course.
About half of the students who completed our survey were among those who actually finished the course, so this data is heavily weighted to those who made more progress through the modules. However, it provides some useful insights for us as we move to the future — as well as to other policy programs that might want to try something similar.
Four Archetypes for Students Who Self-Select Into Summer Coding
Based on the coding backgrounds of participants and what they told us they sought and gained through this course, we grouped students into four buckets:
1. Pivot Hard Into Coding
The students with the most specific purpose for participating in this course were those looking to pivot hard into coding as part of their next career move. Some of these students had a background in query languages (e.g., SQL) and analytical programming languages (e.g., R, Stata). For this group, the most important features they looked for in a summer coding course were learning how to code and developing skills in data science. 'Success' probably means that they developed enough of a background in Python to jump right into an advanced data science course this past fall.
2. Think Computationally and Interact with Coders
We believe the largest group of students was made up of those less interested in developing specific coding skills, and more interested in developing a baseline knowledge so as to be more effective managers of coding and engineering teams. About 40 percent of participants told us that they hoped to use this course to 'develop more confidence in talking to or managing technology staff.' This group wasn't interested in getting the best grade on the problem set or taking the next dive into CS 109 or Machine Learning. Instead, they defined success as building the vocabulary and fluency needed for day-to-day management of technology teams and/or understanding what is technically possible in their respective domains of expertise.
3. Accomplish a Discrete, Functional Task
A small group of students didn't have a long-term vision for learning Python, but rather saw it as a tool to help solve tangible short-term objectives. For example, some students in the middle of internships and fellowships found that their summer employers were asking them to do research tasks that would be much faster and more sustainable with a web scraper. Learning some quick Python through this course gave them a leg up to accomplish a key task that helped their organizations function better. We suspect a number of these students dropped out before the end, once they had found 'success' with a minimum viable skill set. We see this group's efforts as a big success of the model: these students now know how to navigate Python, they know what resources are out there, and when they're ready to continue learning, they'll have the right foundation from which to build.
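To make the 'discrete, functional task' concrete: the kind of scraping job these students described can be surprisingly small in Python. Below is a minimal sketch using only the standard library's `html.parser`; the page content and link structure are made up for illustration (in practice, you would fetch the real page with `urllib.request.urlopen()` or the `requests` library and feed that HTML to the parser instead).

```python
from html.parser import HTMLParser

# Stand-in for a fetched page; a real scraper would download this HTML.
SAMPLE_HTML = """
<ul class="results">
  <li><a href="/report/2018-budget">2018 Budget Report</a></li>
  <li><a href="/report/2017-budget">2017 Budget Report</a></li>
</ul>
"""

class LinkScraper(HTMLParser):
    """Collects (href, link text) pairs from anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []            # finished (href, text) pairs
        self._current_href = None  # href of the <a> tag we are inside, if any

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")

    def handle_data(self, data):
        # Only record text that appears inside an open <a> tag.
        if self._current_href and data.strip():
            self.links.append((self._current_href, data.strip()))

    def handle_endtag(self, tag):
        if tag == "a":
            self._current_href = None

scraper = LinkScraper()
scraper.feed(SAMPLE_HTML)
for href, text in scraper.links:
    print(f"{text}: {href}")
```

A dozen or so lines of parsing logic like this is roughly the scale of task several students set out to automate — small enough to learn from the course's early modules, but genuinely useful to a summer employer.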
4. Just Curious
Perhaps most interesting was a small group of students who didn’t have any specific goal in mind — they just had a bit of extra time and were simply curious about what this program was. We suspect this group likely struggled the most to complete the course. For this group, we recommend taking more time up front to define their goals and aspirations for the program (see the Community Manager recommendation below). If these students can visualize success and tie the skills in this course to something they care about, they’ll probably be more likely to complete more of the material.
Set Clear Expectations for How Much Time Students Need to Invest
The main thing students wanted and got from this program was a crash-course in rudimentary Python programming. By setting up a Slack channel and creating opportunities for interaction, we hoped to extend the Kennedy School community atmosphere into the summer months. But that idea ran directly into summer’s scarcest resource — time. When students aren’t in a physical learning environment and have summer internships and responsibilities, it can be hard to set aside enough free time to learn a complex new skill. About 75 percent of students told us that a lack of time was the biggest barrier to progress.
Because students didn’t have a clear sense of how much time it would take each week, it was tough for them to map out a schedule — and when things took longer than expected, some students decided they couldn’t afford to keep going. We need to do a better job setting expectations around the time required — particularly with incoming first-year students — if we think that a summer coding component might someday become a prerequisite for more advanced data science and machine learning coursework at the graduate level.
If You Want to Encourage Community, Invest in a Community Manager
We made some modest efforts to foster community among the course participants and nudge them to stay engaged over the summer. We created a Slack channel for group conversations and a Piazza page in our Canvas LMS to manage Q&A between students and the teaching team, and held regular virtual office hours with course assistants to answer students' questions and review materials.
But one place we fell short was in helping students feel that this course was a complement to their classroom experience — and for many, the isolation and distance they felt from other students was a big barrier to success. In their final comments, many participants mentioned that they wanted to interact more with other students.
One hypothesis for mitigating this sense of isolation is to entrust an individual with managing the community and social aspects of the course. This 'Community Manager' could help set up physical and virtual study groups, manage conversations between students, and generally try to make sure that the pilot really is a part of the HKS experience and not just a one-off effort. We'll continue to think about what this role might look like. If you've implemented a similar position in your program, we'd love to hear what you have learned!
Evaluating Our Initial Hypotheses
Our core questions going into this experiment were:
1. Is there unmet demand to learn coding among students at the Kennedy School?
Answer: Yes. The scale of demand we saw shows that the demand we heard about anecdotally is real — somewhere between 10 and 30 percent of a cohort is interested. This is about the current demand for our advanced statistics sections among public policy students.
2. Can we piggyback on existing, open-source materials to help our students learn?
Answer: Yes — but some questions remain. We opted to go with an edX course through MIT for this pilot, and we chose it in part because although it was relatively difficult, we knew it was a strong, comprehensive introduction to the topic. Anecdotally, we know that some of our students struggled with the material. Moving forward, we might think about experimenting with different curricula; however, the fact that a good number of our students successfully made it well into the material shows there’s a real opportunity here.
3. Can we replicate the success of most MOOCs with a very modest budget?
Answer: Yes. The success of online courses varies a lot across disciplines, platforms, and source content (you can compare some of the figures yourself with this handy tool). When we compare our completion numbers to what we think are representative ‘peers,’ we do pretty well (e.g., the certificate completion rate for Harvard’s online CS50 is 1 percent).
4. Can we outperform MOOCs with a modest investment in community engagement?
Answer: The jury is still out. This is an area where we learned a lot — and while we modestly outperformed most MOOCs, we think that investing more in community engagement is a key area for future improvement. This is particularly true as we think about how to use summer coursework to augment our core curriculum — the more we can extend the HKS community into the time between semesters, the better we will be at making sure that real, productive learning can take place outside our traditional focus areas.
5. Can we develop some new muscle at the Kennedy School in utilizing summer instruction, especially to teach new incoming students (Harvard Business School being one example of a school that does this well)?
Answer: Yes — but doing it effectively will require more dedicated staff and new skills around community management and engagement. We put together a great distributed team to help students move through their Python modules across the summer, but if we were doing this with a group at scale, it would mean providing a lot more remote support.
In the rearview, the inaugural HKS experimental Python pilot hit a lot of the goals that we set for it. We were able to engage a big group of graduate students from across Kennedy School academic programs, get them all into a shared learning platform, and sustain engagement for a sizable group across the summer. Many students reported that the course was effective in helping them achieve the specific goals they had set for this experience — learning the basics of Python, getting an introduction to data science principles, understanding enough programming to accomplish a particular goal.
But we also learned that there are big challenges for a program like this one in creating broad, sustainable success. Students struggle to find time for Python lessons in between fellowships in state governors’ offices and stints in international NGOs. Without organic ways to create community or a dedicated Community Manager to move things along, many felt isolated and disconnected from the rest of the students in the course.
What can we all take away for the future? First, you need to segment how you define program goals and successes to match the diversity of students who join your program. An experienced but rusty programmer is going to want very different things than someone who has never typed at a command line — and that means they may also need a different kind of support. Second, set clear expectations (for students and the teaching team) about how much time is required for students to really be successful in moving through complex content on their own. And finally, be intentional about how you define and create a community space for online learners.
Scattered around the world and learning alone, many students will feel disconnected from their peers and their goals, and struggle to gain value from a new course. But if you can develop a nurturing and productive learning space that helps bridge the gap between spring and fall semesters, a summer coding experience may someday become a major pillar of what students take away from their graduate school experience.