npm, Inc.
Apr 19 · 7 min read

The 2018/2019 JavaScript ecosystem survey was run from December 1, 2018 to January 8, 2019. Questions were formulated by npm in consultation with domain experts on the exact wording and answer options for specific questions. The survey was announced via email to over 800,000 registered users of npm, on npm package pages, and via the Twitter accounts of npm and its employees, as well as on the websites and Twitter accounts of many ecosystem partners, to ensure the widest possible reach.

The goal of the survey was to learn more about the JavaScript community’s needs and behavior and to help npm and others in the npm community make better technical choices and serve their users better. The survey received 33,478 responses.

What we collected and did not collect

The survey had 54 questions ranging from basic demographics (see below) to in-depth questions about tooling choices, technical preferences, and attitudes towards various professional practices. Completing the survey was optional, meaning respondents did not need to reach the end, and all individual questions were optional as well.

While we asked questions about how much experience developers have had, we did not ask respondents how old they were. Similarly, we did not collect demographics on race or gender identity. The goals of the survey did not justify collecting this data.

We did collect information on several demographic topics relevant to our goals:

Country

We asked respondents “What country do you live in?” Responses came from a total of 194 countries and territories. The top 30 countries are in the table below.

It is hard to measure the exact distribution of npm users around the world, so while some countries are over-represented relative to their population, this may still be a relatively accurate measure of npm’s user base worldwide. We attempted to locate any bias in this sample by comparing these answers to a different source of npm usage: web traffic to npm’s website, as measured by Google Analytics.

Relative to the web traffic, there are some meaningful differences: Germany is #2 here but #5 in web traffic, China is #16 here but #4 on the web, and Japan is #26 here but #12 on the web. In the latter two cases it seems likely that the survey being available only in English was a factor, but we are not sure why residents of Germany would be more likely to answer a survey.

Language

We asked respondents their primary spoken language; twenty-two languages were represented. The survey was available only in English, which skewed the demographics of who responded. In particular (see above), we believe it under-samples China and Japan. The 16% answering “other” languages suggests our list of options could be improved (see below). Our results are definitely biased towards English speakers.

Education

We asked respondents if they are currently in full-time education. Respondents answering “yes” to this question were excluded from questions (see below) about company size and industry.

We asked all respondents, including those still in education, for the highest level of education they had completed. These results are in line with broader measures of education level, so we do not believe there is any bias on this axis.

Company size

Respondents were asked how many people work at their employer, or to indicate that they were not currently working. The distribution of answers matches census data on company size, so we believe our results are not biased towards any particular size of company.

Industry

We asked respondents what industry they work in. In our previous survey “technology” dominated. We are aware that many companies, while being tech companies, are also part of one or more specific industries: for instance, Google is a tech company that is also in advertising, media, and arguably other industries. Airbnb is a tech company that is also in real estate. To attempt to correct for this, we first asked respondents whether the company they work for is considered a “tech” company:

A second question then asked respondents what industries their company was in. Multiple industries were allowed, and the “tech” option was labelled “It really is just tech, no specific industry”. Relative to last year’s results, this produced a much wider range of responses, much less concentrated in tech.

Number of respondents

Of 33,478 responses collected, 25,034 were considered “complete” responses, meaning the respondent reached the final page of the survey and indicated they were finished answering. For questions that were answered both by respondents who completed the survey and those who did not, we compared results and did not find statistically meaningful differences between the groups, so we have included data from incomplete responses in the findings.
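The post does not specify which statistical test was used for this comparison. As an illustration only, one common way to check whether a yes/no answer rate differs between completers and non-completers is a two-proportion z-test, sketched here in pure Python. The group sizes below match the totals above (25,034 complete, 8,444 incomplete), but the “yes” counts are invented for the example:

```python
import math

def two_proportion_z(yes_a, n_a, yes_b, n_b):
    """Two-proportion z-test statistic: is the share answering 'yes'
    the same in group A (complete) and group B (incomplete)?"""
    p_a, p_b = yes_a / n_a, yes_b / n_b
    pooled = (yes_a + yes_b) / (n_a + n_b)  # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: ~60% of completers vs ~59% of non-completers
# answered "yes" to some question.
z = two_proportion_z(15000, 25034, 4980, 8444)
# |z| < 1.96 means no statistically significant difference at the 5% level.
print(abs(z) < 1.96)
```

With these made-up counts the difference is about one percentage point and falls below the conventional significance threshold, which is the kind of result that would justify pooling the two groups.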

However, because questions later in the survey were more likely to be skipped by respondents who abandoned the survey, the total number of respondents for those questions was lower. The lowest number of responses for a question given to all respondents was 24,781.

Some questions were asked conditionally based on previous responses, e.g. users who said they do not use TypeScript were not asked how often they use it. These questions have lower numbers of respondents for that reason, but we believe this is more likely to increase than decrease their accuracy.

Differences from 2017/2018 survey

We learned a great deal from our 2017/2018 survey, run from December 2017 to January 2018, and this was reflected in changes to our survey this year.

Desktop, mobile, and native apps

In last year’s survey our questions were ambiguous as to what was meant by “mobile”: does a mobile web app count, or a native mobile app? This year we asked multiple unambiguous questions about developers who write code “for browsers”, “for servers”, “native apps”, and “embedded devices”, and included an “I do not write this kind of code” option for each. This gives us useful, unambiguous numbers about who is writing which types of apps, and where.

Industry

As mentioned above, our question about industry last year included several ambiguities: respondents who were unemployed, in full-time education, or at companies competing in multiple industries were given unclear options. We believe our separate clarifying questions this year have done much to resolve these ambiguities.

Web frameworks

Last year our list of web frameworks was smaller, so we expanded it this year. We also gave a single option for “Angular”, but learned from the Angular community that there are two separate and incompatible frameworks with that name: Angular 1.x (also known as AngularJS) and Angular 2.x and later (known simply as Angular). We gave separate options for these frameworks this year.

What we intend to improve

While we believe our survey significantly improves on last year’s questions, there are still opportunities for further improvement that we hope to make in subsequent years.

Order of questions

Respondents are less likely to answer questions further down in the survey, resulting in a steady decline in the number of respondents for later questions. While in some cases question order matters, we may be able to randomize question order to some degree to ensure a better distribution of respondents.
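The post does not describe a concrete randomization mechanism. One simple approach, sketched below under our own assumptions, is to keep demographics first (so early abandonment still yields usable demographic data) and shuffle the remaining question blocks per respondent with a deterministic seed; the block names here are hypothetical:

```python
import random

# Hypothetical survey structure: demographics stay first, the
# remaining blocks are shuffled independently for each respondent.
FIXED = ["demographics"]
SHUFFLED = ["tooling", "frameworks", "security", "practices"]

def question_order(respondent_id):
    """Deterministic per-respondent block order: seeding the RNG with
    the respondent ID means the same respondent always sees the same
    order, even across page reloads."""
    rng = random.Random(respondent_id)
    blocks = SHUFFLED[:]          # copy, so the module-level list is untouched
    rng.shuffle(blocks)
    return FIXED + blocks

print(question_order(42))
```

Because each block is equally likely to appear early, the drop-off from abandonment is spread roughly evenly across blocks rather than concentrated on whichever questions happen to come last.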

Number of languages

Our list of spoken languages excluded 15% of respondents; we should have a longer list.

List of territories

There were some unfortunate errors in the names of some territories that reflected political conflicts, and some equally unfortunate omissions.

How respondents differ from the general population

As mentioned above, respondents differ from general JavaScript users in that English speakers are over-represented, with knock-on effects on which countries are represented.

We did not collect age or gender identity of the respondents, so we do not know whether our survey is representative of the broader population of JavaScript users on these metrics.

We believe well over 15 million individuals were given the opportunity to respond to the survey, but as with any voluntary survey, answers will always be biased towards individuals willing to give up some time to answer questions about JavaScript. It is possible that our methods for reaching these users create some bias in our answers towards users who have more affinity for npm as an organization. While we cannot rule out this effect, where we asked similar questions our responses conform to other, non-npm-affiliated efforts like the 2018 State of JS Survey.

Both our survey and the State of JS Survey may have a bias towards users who are more enthusiastic about JavaScript in general. While we cannot detect or correct for this bias, we believe our numbers are still useful as relative gauges, both between options within this year’s survey and between answers in this survey and last year’s.

What conclusions we can draw

Despite some flaws in questions and bias in our samples, our survey contains a wealth of data about JavaScript’s users and we’re excited to produce a series of posts about various aspects of JavaScript usage.

npm, Inc.

npm is the package manager for JavaScript and the world’s largest software registry. Here are some of our thoughts.
