Students, Data, and Blurred Lines:
A Closer Look at Google’s Aim to Organize U.S. Youth and Student Information
By Tracy Mitrano, Ph.D., J.D.
Introduction
As it is transforming the world, technology is transforming education. We see it all around us: from the national level with President Obama’s ConnectED initiative, to the state level with Governor Cuomo’s Smart Schools Commission, to the local level with schools across the country adopting cloud services and devices. Through technology, learning can be customized to meet the needs of each child, teachers can see in real time where a student is struggling and needs support, and collaboration becomes easy with tools such as file-sharing and online discussion forums.
As technology transforms the classroom, Internet companies and government leaders have a vital role to play in ensuring that the privacy and integrity long afforded to youth and student data in the physical world carry over into the digital era. Particularly now, as these technologies are taking off in schools, it is urgent that the companies behind them comply with federal law and respect the school districts, colleges, universities, and students whom they serve. Internet cloud providers must remain vigilant about the present dangers and future concerns surrounding digital records of youth and students if American society is to achieve the full potential that technology has to offer in classrooms, teaching, learning, and research. To the extent that certain Internet business models, such as targeted advertising, may conflict with our expectations for the privacy and security of student data, it is essential to shape these business models through appropriate regulatory intervention now, before trust in classroom technology is irremediably compromised. This paper offers a case study of how one such company, Google, participates in this dynamic.
The New Information Economy: Consumerization of Online Services, Cloud Computing, and Profiling
Computers have been used in schools for decades. In the past, they were loaded with educational software and used to teach children tasks such as document creation, computational work, and research skills. Before the advent of the Internet, these computers were often isolated from the outside world; even after the Internet came into public use, on-premises systems maintained the privacy and integrity of education records and data about youth and students. K-12 schools, colleges, and universities did not share those records with third parties.
In the last several years, new consumer services have emerged for file-sharing, storage, and social networking, such as Gmail, DropBox, Twitter, Skype, YouTube, and Facebook. The notion of services for “free” has encouraged their popularity with users. How these companies use personal data, however, is neither well known nor well understood, even by savvy digital users. “Click-through” Terms of Service, which consumers neither read nor understand, give these companies permission to do just about anything they want with that data, including profiling individuals, selling the data, or using it for increasingly sophisticated marketing and targeted advertising. To appreciate fully the nexus between technology and the market is to make sophisticated connections between the value of data and a global information economy fueled largely by marketing, advertising, and user profiling.
Profiling
The underlying business model for these “free” services is a three-step process. First, an Internet company embeds code in web pages to keep track of web sites visited, posts made on social media sites, products purchased, locations visited (based on mobile device location services), and many more attributes associated with an individual’s online actions. Second, all of this information is combined into a personalized online profile. Third, the Internet company auctions space on web sites and in search engines so that each ad is matched to the user’s interests; the payoff to the Internet company occurs when the user clicks on the ad.
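As an illustration only, the sketch below models these three steps in miniature. All names (TrackedEvent, build_profile, match_ad) are hypothetical, and the “auction” is reduced to a simple keyword-overlap score; this is a caricature of the business model described above, not any company’s actual system.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class TrackedEvent:
    user_id: str
    kind: str       # e.g. "page_view", "purchase", "location"
    attribute: str  # e.g. a site category, product type, or place visited

def build_profile(events):
    """Step 2: fold tracked events into a per-user interest profile."""
    profiles = {}
    for e in events:
        profiles.setdefault(e.user_id, Counter())[e.attribute] += 1
    return profiles

def match_ad(profile, ad_inventory):
    """Step 3: 'auction' reduced to picking the ad whose keywords best
    overlap the profile; revenue accrues only if the user later clicks."""
    return max(ad_inventory,
               key=lambda ad: sum(profile.get(kw, 0) for kw in ad["keywords"]))

# Step 1 is the embedded tracking code on web pages and apps, which emits
# events like these as the user browses, posts, buys, and moves around:
events = [
    TrackedEvent("u1", "page_view", "running_shoes"),
    TrackedEvent("u1", "purchase", "running_shoes"),
    TrackedEvent("u1", "page_view", "marathon_training"),
]
ads = [
    {"name": "shoe_ad", "keywords": ["running_shoes", "sneakers"]},
    {"name": "car_ad", "keywords": ["sedan", "suv"]},
]
profiles = build_profile(events)
print(match_ad(profiles["u1"], ads)["name"])  # -> shoe_ad
```

The point of the sketch is simply that the profile, not any single ad, is the durable asset: every additional tracked event makes the matching step more valuable.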
With search and online profiling, marketing can be far more targeted. The more the Internet company knows about a specific user, the more effective the ads. Likewise, the more sites, services, or even devices the Internet company controls, the more information it can gather about the user, making the ad-matching service increasingly effective. Expanding reach is therefore critical for these online marketing companies: to attract more users, they build more services and make them free in order to garner ever-increasing amounts of personalized data. Revenue potential expands as a function of that information, and this approach has become the standard business model for the most profitable and popular Internet companies.
Cloud Computing in Schools, Colleges, and Universities
Concurrent with the rise in popularity of these consumer applications is the emergence of enterprise cloud computing. Enterprise cloud computing delivers services such as storage, email, document creation, collaboration, and other programs to contracting parties, including K-12 schools, colleges, and universities, through Internet companies that host the infrastructure, applications, and data on their own premises. Cloud computing services generally operate on a per-user subscription model. For educational institutions, cloud computing has real benefits, especially in reducing cost, overhead, and staffing. Thus, K-12 school districts, colleges, and universities contract directly with the vendor for the services. Central to those contracts are provisions that explicitly require compliance with federal law to protect education records under the Family Educational Rights and Privacy Act (FERPA).1 The vendor promises to act as a “school official” by not disclosing a student’s education record apart from recognized statutory exceptions, such as the health and safety of the individual student.
A key point in these relationships is the respect for the institution’s statutory obligations and the students’ privacy. In practice, that means that at no point does the cloud provider have a legal right to use and/or resell education records for its own commercial purposes. These school-vendor relationships are purposefully designed to meet the mission-driven needs and compliance obligations of school districts, colleges, and universities.
Internet companies and educational institutions, parents and students, must be clear with each other in negotiations and contract formation about the technological and business practices of enterprise cloud computing. With a consumer “click-through” license, the end user assumes the risk of disclosure; whereas, with an enterprise contract, parents and students place their trust in school districts, colleges, and universities that such a disclosure will not occur.
Embedded in that trust relationship is the public policy recognition of the vulnerabilities particular to the age and stage of development of students. Students require privacy to learn from mistakes without fear of exposure or embarrassment. Speech and curiosity could so easily be chilled if a person thought that something they did or said while in that critical formative process could later in their life be used against them. Within this protected zone to develop strong intellects and open hearts lies the hope that a student may grow into a well-educated, productive member of the U.S. economic workforce and a vibrant citizen in our democratic society.
The Family Policy Compliance Office of the Department of Education has made two points clear regarding cloud computing. First, K-12 schools, colleges, and universities cannot escape their obligations under FERPA by outsourcing the processing and handling of education records to third-party vendors. The obligation follows the records, and the institution remains responsible for its vendor’s compliance, which should be made clear in the contract. Second, data-mining or any other use of education records for the vendor’s own purposes — including, but not limited to, advertising and other commercial purposes — is a per se violation of FERPA.2 Thus, it is critical that schools, colleges, and universities have sufficient transparency regarding the technologies and business purposes vendors apply to the education records under their control. Informed consent rests on this knowledge, as does the responsibility that educational institutions assume when contracting on behalf of students. Ad-revenue-subsidized services are neither legal nor appropriate for educational institutions. An Internet company that mines education records for its own business purposes acts against both law and public policy.
The Children’s Online Privacy Protection Act (COPPA)3 also comes into play for any companies delivering online services to children or students who are under the age of 13. The seven rules that must be adhered to apply “…to operators of commercial websites and online services (including mobile apps) directed to children under 13 that collect, use, or disclose personal information from children…”4 In particular, parents must be given the opportunity to provide “verifiable parental consent,” as well as the ability to restrict, review, or prevent further use of information that has been collected. As with FERPA, adequate notice and informed consent are a prerequisite for the use of protected children’s information. Any commercial uses, including the build-out of online profiles, fall squarely under this rule.5
While some companies have recognized the need for compliance and have respected those rules in both contract formation and technological and business practices, others have ignored those obligations, obfuscated reasonable inquiries, and deceived contracting educational institutions. Under those circumstances, it is impossible for the school, college, or university to exercise informed consent. One company that is attempting to play in both the consumer world and the academic world warrants close scrutiny. That company is Google — the world’s largest online advertising company.6 Notably, Google has joined other education service providers in signing a voluntary, industry-backed Student Privacy Pledge.7 Among the commitments listed in the pledge are no profiling of students other than for authorized educational purposes and no use of student information for targeted advertising. The remainder of this paper will highlight outstanding questions and concerns relating to Google’s business practices in schools and explain why Google may originally have been so reluctant to sign the pledge.8 Moreover, Google’s history of playing fast and loose with policies and promises about privacy suggests that government, the education sector, parents, and students might do well to retain a healthy skepticism toward the actual value of Google’s signature on this Pledge.
Why Focus on Google?
The answer is in many ways obvious: Google is the most economically powerful and technologically advanced Internet company in the world. It is not only a $50+ billion-a-year online advertising powerhouse; it is also the dominant online search provider,9 the dominant player in mobile platforms (with the Android platform),10 a leader in online email services,11 and a major player in mapping services (Google Maps),12 online video (via YouTube),13 web browsers (via Chrome),14 and numerous other online services.
The less obvious answer is that Google has established a pattern of, forgive the cliché, “do not ask permission, beg forgiveness” regarding regulation. The Google Street View, Google Buzz, and Safari By-Pass decrees evidence this pattern. In each case, the principal issue on which Google has faltered, and on which the F.T.C. has acted, is privacy. In the education sector, this behavior is especially egregious because it preys upon our society’s most vulnerable members: youth and students.
Nor is the connection between market success and a focus on youth and students coincidental. Google still makes almost 90% of its revenue from online advertising.15 What better population to target than youth and students? Not only are they impressionable targets for the entertainment and advertising culture that dominates U.S. society today, but the longer trajectory of information gathered about a person — from the beginning of their digital presence on throughout their lifetime — plays to Google’s business and commercial needs. Privacy, that nebulous but nonetheless critical quality to be reasonably enjoyed by all people — not least youth and students — hangs in the balance.16
Google began with a business model based on providing purely consumer-oriented services — similar to Facebook. Under this model, Google was highly successful and profitable, delivering ads that relate directly to an online user’s interests, browsing history, location, and “millions” of other attributes that it gathers about online users.17 Unlike Facebook, however, Google then pushed into highly regulated verticals such as health care, government, law enforcement, education, and the private sector (via the recently repackaged “Google for Work”). Now, all of Google’s services and products act as an engine for collecting data and building massive personal profiles on every user of its offerings.
So while Google is prominent in the lives of adult consumers, it has also carefully positioned itself as central to the lives of young children. Over 40 million students and teachers around the world use Google Apps for Education (GAFE).18 In the U.S., over 10,000 schools use the service.19 And now Google Chromebooks are becoming a fixture in schoolchildren’s days.20 Chromebook sales are expected to hit 5.2 million units this year and 14.4 million units by 2017,21 as American schools in particular adopt the devices at an explosive rate (shipments of Chromebooks increased 67% in the third quarter of 2014 compared to the previous quarter,22 and Chromebook shipments have just surpassed Apple’s iPad for the first time in the U.S. education market23). According to Google, Chromebooks are now approaching a 50% share in U.S. education.24 Even when kids aren’t in the classroom, Google is eyeing the 12-and-younger population — most recently announcing that it will create specific versions of some of its products geared toward this user base.25
Lastly, for reasons not of its choosing, a window into Google’s technological processes has been opened. The discovery phase of the Gmail litigation in the U.S. District Court for the Northern District of California, San Jose, has brought some remarkable technological and business-practice facts to light. Google is surely not the first company in the history of the United States to have gaps exposed between its practices and its statements. But because it is so powerful a corporation, because it plays so significant a role in most people’s lives, and because it sets the trend among Internet corporations, these gaps illustrate tensions that inform the critical issues of governance in the twenty-first-century information society. It is therefore worth taking a closer look at them, especially given the light they shed on Google’s deceptive practices in the area of its educational services and applications.
Gmail Case
In 2013, a group of plaintiffs in a California federal court brought suit against Google alleging that it intercepts and scans their email without consent in order to create user profiles that Google uses to operate other services, including advertising. Two of those plaintiffs were users of Google Apps for Education (GAFE). Federal District Court Judge Lucy H. Koh explained the plaintiffs’ claim:
“After [date redacted], Google separated its interception of emails targeted for advertising from its interception of emails for creating user profiles. As a result, after [date redacted], emails to and from users who did not receive advertisements are nevertheless intercepted to create user profiles. Accordingly, these post-[date redacted] interceptions impacted all Gmail and Google Apps users, regardless of whether they received advertisements.”26 [emphasis mine]
While ad serving and user profiling are distinct processes within Gmail, until 2010 they operated at the same point in the email delivery process, since both were triggered only after an email was actually opened. But by 2010, Google realized that tens of millions of users were escaping the user profiling process, known as Content OneBox, because they used versions of Gmail in which, for one reason or another, the ad-serving process was disabled or absent. Some of these users were accessing Gmail through smartphone apps, which did not display ads due to their limited screen size. Others were using GAFE and Google Apps for Government (GAFG), which did not serve ads by default. Google therefore decided to move the Content OneBox profiling process upstream in the email delivery pipeline, to a point before the actual delivery of messages to user inboxes.27 Thus, as revealed in court documents, from sometime in September-October 2010 onward, all inbound Gmail messages were analyzed for user profiling purposes before they were delivered to users — regardless of whether those users were being served ads. This meant that messages sent to smartphone users and GAFE users were analyzed in just the same way as ordinary Gmail messages. Indeed, Google even began to analyze messages that users deleted without ever opening them.
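To make the architectural shift concrete, the sketch below contrasts the two placements of the profiling step. It is a simplified, hypothetical reconstruction based only on the description above; the names (Mailbox, extract_topics, match_ads) are illustrative placeholders, not Google’s actual code.

```python
from collections import Counter

def extract_topics(message: str) -> Counter:
    # Stand-in for content analysis: here, just word counts.
    return Counter(message.lower().split())

def match_ads(profile: Counter, inventory: list):
    # Stand-in for ad matching against the accumulated profile.
    hits = [ad for ad in inventory if profile.get(ad, 0) > 0]
    return hits[0] if hits else None

class Mailbox:
    def __init__(self, ads_enabled: bool):
        self.ads_enabled = ads_enabled  # False by default for GAFE/GAFG
        self.profile = Counter()
        self.inbox = []

# Before ~2010: profiling and ad serving ran at the same point, and only
# where the ad-serving path existed and the message was actually opened.
def deliver_pre_2010(box: Mailbox, message: str):
    box.inbox.append(message)

def open_pre_2010(box: Mailbox, message: str, inventory):
    if box.ads_enabled:
        box.profile.update(extract_topics(message))  # profiling
        return match_ads(box.profile, inventory)     # ad serving
    return None

# After ~2010: the profiling step (the "Content OneBox" role) moves
# upstream, before delivery, so it runs on every inbound message:
# GAFE mail, ad-free mobile mail, even mail later deleted unread.
def deliver_post_2010(box: Mailbox, message: str):
    box.profile.update(extract_topics(message))      # profiling for all
    box.inbox.append(message)

def open_post_2010(box: Mailbox, message: str, inventory):
    return match_ads(box.profile, inventory) if box.ads_enabled else None

gafe_user = Mailbox(ads_enabled=False)
deliver_post_2010(gafe_user, "practice schedule for the track team")
print(gafe_user.profile)  # populated even though this user never sees ads
```

The design point at issue in the litigation is visible in the last two lines: once profiling is attached to delivery rather than to ad display, disabling ads no longer disables data collection.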
This distinction between serving ads and data-mining/profiling has proved nettlesome in the history of enterprise contracts with schools, colleges, and universities. First, Google offered only “contracts by URL,” meaning that the substantive provisions of a contract remained at Google’s discretion to change without notice to the college or university. Data-mining and profiling practices were never mentioned in those contracts or even in the URL statements they referenced. Instead, something of a linguistic shell game emerged. If representatives of schools, colleges, and universities asked Google about its data-mining/profiling practices, Google’s stock response promised not to serve ads. If legal counsel or chief information technology officers had concerns about the nexus between “ads” and data-mining/profiling, negotiations with Google did not allow those concerns to be fully expressed or adequately addressed. In the main, Google offered only salespeople to discuss the contracts, not lawyers, even after college and university attorneys emphatically insisted on such discussions. In many of the earliest cases, Google’s failure to bring lawyers to the table resulted in contracts that did not even include FERPA provisions. Under pressure from institutional counsel to include that language, Google eventually added those provisions but still failed to provide either counsel or chief information officers with sufficient information to exercise informed consent for the service with respect to FERPA.
A perfect storm of factors converged by which many schools, colleges, and universities capitulated to Google. First, internal financial pressures have fallen heavily on information technology as the unit to which presidents, provosts, and chief financial officers look when trimming institutional expenses. “Free” email looked particularly attractive under those circumstances. Externally, the economic downturn of 2008 and the years that followed exacerbated those pressures. Second, by 2010, Google began to include FERPA provisions, though it still failed to explain its technological and business methodologies. Third, and central to this discussion, Google obfuscated its technological practices by transposing “ads” as a response to queries about data-mining and profiling. Even when institutional counsel asked directly, Google refused to be transparent about its technological and business processes. Without the requisite knowledge of how and in what ways the technology operated to Google’s benefit and at the expense of the schools, colleges, and universities — and specifically, their students — institutions had neither the information nor the resources to inform their consent.
Were it not for the chink in Google’s armor that the discovery process of the Gmail litigation yielded, one might chalk these discrepancies up to the gaps that emerge in periods of rapid technological and business transformation. The documents that emerged from this case confirmed the suspicions of many chief information officers and institutional attorneys regarding data-mining and business practices in GAFE contracts, consistent with Google’s established pattern of purposefully forging ahead of existing law. Precisely to the point of their concerns, the judge in the Gmail litigation, Judge Koh, ruled that:
“Google points to its Terms of Service and Privacy Policies, to which all Gmail and Google Apps users agreed, to contend that these users explicitly consented to the interceptions at issue. The Court finds, however, that those policies did not explicitly notify Plaintiffs that Google would intercept users’ emails for the purposes of creating user profiles or providing targeted advertising.” [emphasis mine]
Thus, in March 2014, with pressure building, Google publicly acknowledged that it was indeed scanning the emails of GAFE users for ad-related purposes, but refused to deny that it also profiled students in GAFE:
“A Google spokeswoman confirmed to Education Week that the company “scans and indexes” the emails of all Apps for Education users for a variety of purposes, including potential advertising, via automated processes that cannot be turned off — even for Apps for Education customers who elect not to receive ads. The company would not say whether those email scans are used to help build profiles of students or other Apps for Education users, but said the results of its data mining are not used to actually target ads to Apps for Education users unless they choose to receive them.” [emphasis mine]
Shortly thereafter, on April 30, 2014, Google published a blog post announcing that it would discontinue “ads scanning” in GAFE contracts. The announcement came without any additional explanation, despite Google’s statement just a few weeks earlier that GAFE ad scanning is “100% automated and can’t be turned off”:28
“We’ve permanently removed all ads scanning in Gmail for Apps for Education, which means Google cannot collect or use student data in Apps for Education services for advertising purposes… We’re also making similar changes for all our Google Apps customers, including Business, Government and for legacy users of the free version, and we’ll provide an update when the rollout is complete.”
Google’s statement that it had “removed all ads scanning” was notably silent on profile scanning. Once again, Google transposed language about “ads” to cover up data-mining technologies and commercial use of education records. Additionally, Google did not mention that, when enabled by a school administrator, the GAFE toolbar contains both enterprise and consumer apps, such as Gmail and YouTube respectively. As a result, GAFE users may leave the protected enterprise environment and enter consumer applications not covered by their school’s contract, leaving them subject to ads and related scanning without notice or the opportunity to opt out.
This case highlights Google’s misrepresentations to students, parents, teachers, and school administrators. It demonstrates that Google did not provide those parties with the requisite information about its technological and business processes. Google stripped these parties of meaningful informed consent and made it impossible for educational institutions to determine whether Google would, could, or did meet regulatory obligations. Google did not provide proper notice or opt-out provisions related to its email scanning for the purposes of targeted advertising and creating user profiles. It refused institutional counsels’ requests to negotiate in kind with Google attorneys. It offered “contracts” that are not proper contracts under the Uniform Commercial Code (U.C.C.); those documents operate by changeable — and frequently changed — URLs. Indeed, to date Google continues to offer colleges and universities contracts that, no matter what their provisions state, include a fail-safe “out”: a default reference to its consumer Privacy Policy. In other words, Google builds in layer upon layer of excuses and prospective defenses for its ongoing pattern of deception.
Google obfuscated a clear response as to whether it data-mined education records for its business purpose of profiling. It transposed policy about its use of ads to cover up clear answers about profiling. Finally, it deceived GAFE users. For years, Google made a clear promise on its website: “Note that there is no ad-related scanning or processing in Google Apps for Education or Business with ads disabled.”29 Google removed this sentence from its website in the same month that the Gmail litigation revealed allegations that Google scanned all GAFE messages for user profiling, even when no ads were served; by that point in the controversy, the linguistic shell game had finally lost its ability to deceive K-12 schools, colleges, universities, parents, and students (see Appendix A).
Comparing the revelations in the court documents with Google’s most recent pledge to protect student data raises serious questions about whether Google’s practices have actually changed, and it puts further pressure on a business model in conflict with student privacy. Google’s corporate behavior raises doubts about its genuine commitment to realizing the full meaning of the Pledge. In this case, the timing was telling. Four days after the President’s January 12, 2015, speech at the F.T.C., in which he stated that for companies that do not sign on to the Student Privacy Pledge, “we intend to make sure that those schools and those parents know you haven’t joined this effort,” and four days in advance of the State of the Union address, Google quietly added its name to the list of signatories. For a company with a finely tuned public relations machine and a strong brand image, this end-of-the-news-cycle, silent signing resembles past behaviors that signal smoke and stalling, not an actual commitment to compliance.
Conclusion
No matter how much we all might value innovation, it is not worth the damage incurred through deception of the schools, colleges, and universities serving society’s most vulnerable members: youth and students. No matter how much we enjoy the fruits of an economy enlivened by the Internet, it is not worth the cost of undermining essential principles of American society that prize privacy as a prerequisite to personal autonomy. In taking advantage of minors — whom the law explicitly protects in the consumer arena with COPPA and in their role as students under FERPA — Google has demonstrated a willful indifference to their needs. Ultimately, Google has blurred the lines of valid assurances and left students without the privacy safeguards to which they are entitled by law.
Given Google’s well-established track record of aggressively pushing legal boundaries to expand its ability to collect and mine data, it is clear that without strong government engagement the right to privacy as we know it will ultimately cease to exist. In a 39-page motion filed in June 2013 to have a class-action data-mining lawsuit dismissed, Google stated that “a person has no legitimate expectation of privacy in information…turned over to third parties.”30 A balance between innovation and law is achievable through the regulatory bodies designed to correct such abuses and market failures. To that end, an investigation into Google’s behaviors and practices in GAFE contracts is warranted. Google must be made to account for its deceptive and misleading statements. The purpose of enforcement, of course, is not to punish for the sake of punishing; rather, it is to obtain a positive change in future behavior. Thus, an investigation into Google’s deceptive practices in the education market must focus above all on establishing the essential limits to the profiling-based targeted advertising model in our schools.
As a result of Google’s unsanctioned breach of student privacy, years of student data have been gathered and used for “ads scanning” purposes.31 Google should be required to disclose the specifics of the data collected: what was collected, for how long, and how it was used, and, if appropriate, it should be made to disgorge itself of that data. Additionally, Google should be required to disclose the circumstances under which GAFE users have been and continue to be subject to profiling. Without adequate disclosure, informed consent cannot be given. If Google had a better record of adhering to consent decrees, perhaps such a measure would not be necessary. But particularly in the context of the Buzz consent decree, this type of deception — unsanctioned data use and profiling — could be viewed as its most significant transgression to date. Google’s well-established pattern of “not asking permission, but begging forgiveness” has resulted in a loss of the public’s trust even as we celebrate its innovative advances.
The technological ability to track a user back and forth between consumer applications and educational apps is too easy to deploy, and too tempting, to assume that Google’s advanced algorithms are not functioning in this manner today. Looking forward, educational contracting parties should not be made to guess about such practices without a proper foundation. Yet Google has actively resisted producing that information, and has in fact acted to the contrary, in part because transparency is at odds with its commercial incentives. Nor, without government action, will Google be held accountable: its bargaining power is too significant, and its technological edge too sharp, a match for even the most sophisticated higher education contracting parties.
Rather than being trustworthy and transparent, Google has chosen to send salespeople instead of attorneys to negotiate contracts, it has played linguistic word games to avoid compliance issues important to the integrity of educational institutions, and it has demonstrated obfuscating behavior by continuing to pressure those institutions into signing “contracts by URL.” It has even been willing to sign a trade association Pledge to hedge against scrutiny, but without the transparency, enthusiasm, or assurances that would indicate real change in its technical and business practices to date. School districts, colleges and universities, parents, and students have reasons, based on past practices, to remain wary; the burden now rests on Google to be transparent and to verify that it is in fact behaving in accordance with the Pledge’s promise of student data privacy.
No patent need be disclosed to answer simple questions regarding practices that track youth and students. No competitive advantage need be lost to respect the obligations of educational institutions and the public that they serve. The time to grant Google the benefit of the doubt about these essential legal and public policy issues has passed. It is time to investigate and correct Google’s actions in the name of privacy for youth and students. Only with these actions can the full promise of new technologies be made comfortably consistent with the enduring expectations the public places on schools, colleges, and universities to serve the larger aspirations and goals of American society.
Appendix A
Google’s Security and Privacy FAQs
For years, Google made a clear promise on its website stating, “Note that there is no ad-related scanning or processing in Google Apps for Education or Business with ads disabled.”
August 21, 2013, screenshot (statement was in place from at least 2011 through sometime between August 21-September 30, 2013):
Google removed this sentence from its website in the same month that the Gmail litigation revealed allegations that Google scanned all GAFE messages for user profiling, even when no ads were served; by that point in the controversy, the linguistic shell game had finally lost its ability to deceive K-12 schools, colleges, universities, parents, and students.
September 30, 2013, screenshot (statement was removed):
Footnotes
1 U.S. Department of Education, “Laws & Guidance: FERPA General Guidance for Students,” http://www2.ed.gov/policy/gen/guid/fpco/ferpa/students.html. As noted, “The term ‘education records’ is defined as those records that contain information directly related to a student and which are maintained by an educational agency or institution or by a party acting for the agency or institution.” In other words, FERPA applies to all education records such as emails, faculty notes to students, work product, papers, exams, attendance data, grades, and transcripts.
2 U.S. Department of Education Privacy Technical Assistance Center, “Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices,” http://ptac.ed.gov/sites/default/files/Student%20Privacy%20and%20Online%20Educational%20Services%20%28February%202014%29.pdf
3 Federal Trade Commission, “Children’s Online Privacy Protection Rule (“COPPA”),” www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/childrens-online-privacy-protection-rule
4 Federal Trade Commission, “Complying with COPPA: Frequently Asked Questions,” Section: General Questions About the COPPA Rule, www.business.ftc.gov/documents/0493-Complying-with-COPPA-Frequently-Asked-Questions#General Questions
5 Federal Trade Commission, “Complying with COPPA: Frequently Asked Questions,” Section: COPPA and Schools, www.business.ftc.gov/documents/0493-Complying-with-COPPA-Frequently-Asked-Questions#Schools. As noted, “…the school’s ability to consent on behalf of the parent is limited to the educational context — where an operator collects personal information from students for the use and benefit of the school, and for no other commercial purpose” and …“the operator must provide the school with full notice of its collection, use, and disclosure practices, so that the school may make an informed decision.”
6 Market Realist, “Why Google continues to dominate the online advertising market,” September 23, 2014, http://marketrealist.com/2014/09/google-continues-dominate-online-advertising-market/. See also, CNET, “Google makes more money from ads than print media combined,” November 12, 2012, www.cnet.com/news/google-makes-more-money-from-ads-than-print-media-combined. As noted, “The search giant generated $20.8 billion in ad revenue in the first six months of 2012, while newspapers and magazines in the U.S. made $19.2 billion according to Statista.”
7 Student Privacy Pledge, http://studentprivacypledge.org/
8 Bloomberg BusinessWeek, “Obama Takes On Google With Law to Protect Privacy of U.S. Kids,” January 21, 2015, http://www.bloomberg.com/news/articles/2015-01-21/obama-takes-on-google-in-law-to-protect-privacy-of-u-s-kids
9 comScore, “comScore Releases April 2014 U.S. Search Engine Rankings,” May 16, 2014, http://www.comscore.com/Insights/Market-Rankings/comScore-Releases-April-2014-US-Search-Engine-Rankings. Google is reported to have 67.6% of the online search market.
10 IDC, “Smartphone OS Market Share, Q3 2014,” http://www.idc.com/prodserv/smartphone-os-market-share.jsp. Android is reported to have an 84.4% share of worldwide smartphone shipments.
11 TechCrunch, “Gmail Now Has 425 Million Users, Google Apps Used By 5 Million Businesses And 66 Of The Top 100 Universities,” June 28, 2012, http://techcrunch.com/2012/06/28/gmail-now-has-425-million-users-google-apps-used-by-5-million-businesses-and-66-of-the-top-100-universities/
12 comScore, “comScore Reports April 2014 U.S. Smartphone Subscriber Market Share,” June 4, 2014, http://www.comscore.com/Insights/Press-Releases/2014/6/comScore-Reports-April-2014-US-Smartphone-Subscriber-Market-Share. Google Maps is the #1 mobile mapping app on 41.5% of devices.
13 comScore, “comScore Releases October 2014 U.S. Desktop Online Video Rankings,” http://www.comscore.com/Insights/Market-Rankings/comScore-Releases-October-2014-US-Desktop-Online-Video-Rankings. Of the 191.5 million Americans who watched online content, 191,456,000, or virtually 100%, viewed content on Google Sites.
14 Shareaholic, “Web Browser Usage Trends,” May 16, 2014, https://blog.shareaholic.com/browser-share-report-05-2014/. Google Chrome has the largest browser share, with 35.65%.
15 Google Q3 2014 Quarterly Earnings Summary, https://investor.google.com/pdf/2014Q3_google_earnings_slides.pdf
16 Business Insider, “Google’s Top Futurist Says Your ‘Privacy May Be An Anomaly,’” November 22, 2013, www.businessinsider.com.au/google-vinton-cerf-declares-an-end-to-privacy-2013-11. Vint Cerf, a “Father of the Internet” and paid Google evangelist, has stated, “Privacy may be an anomaly.”
17 Medium, “Google Mines Gmail for Big Data Gold,” Jeff Gould, November 11, 2014, https://medium.com/@jeffgould/google-mines-gmail-for-big-data-gold-cea42e9b88ee. As the piece notes, Gmail’s algorithms classify users into literally “millions of buckets.”
18 ZDNet, “Google upgrades Classroom apps with more tools for educators,” October 14, 2014, http://www.zdnet.com/google-upgrades-classroom-apps-with-more-tools-for-educators-7000034661/
19 New York Times Personal Tech, “Chromebooks Win Buyers Ready to Live in the Cloud,” May 21, 2014, http://www.nytimes.com/2014/05/22/technology/personaltech/chromebooks-win-users-and-some-respect.html?_r=2
20 Google’s Chromebook device is bound by the same Google Privacy Policy as all other Google products and services, www.google.com/intl/en/policies/privacy/ (see “Information we collect” and “How we use information we collect”). In order to use a Chromebook, the user must login with a Google account, which ties the user into Google’s cloud infrastructure and enables profiling and tracking. Given these points, Google has the right to use all data that is input into a Chromebook computer, including all information typed by a student while working at school.
21 The Guardian, “Google Chromebooks eating into US education market, says Gartner,” August 11, 2014, http://www.theguardian.com/technology/2014/aug/11/google-chromebooks-education-market-gartner
22 ABI Research, “Chromebook Shipments Increase by 67% QoQ as Acer Dominates 37% of the Market in 2Q 2014,” October 21, 2014, https://www.abiresearch.com/press/chromebook-shipments-increase-by-67-qoq-as-acer-do
23 Business Insider, “Google’s Chromebook Is Killing The iPad In One Key Market,” December 1, 2014, http://www.businessinsider.com/google-chromebooks-outsell-ipads-education-2014-12
24 OMG!Chrome!, “Chromebooks Near 50 Percent Share of Education Market in US,” October 2, 2014, http://www.omgchrome.com/chromebooks-50-percent-education-share-us/
25 USA Today, “Google to revamp its products with 12-and-younger focus,” December 3, 2014, http://www.usatoday.com/story/tech/2014/12/03/google-products-revamped-for-under-13-crowd/19803447/
26 Order Granting in Part and Denying in Part Defendant’s Motion to Dismiss at page 4, http://www.consumerwatchdog.org/resources/GoogleGmailOrder092613.pdf
27 Medium, “The Natural History of Gmail Data Mining: Gmail isn’t really about email — it’s a gigantic profiling machine,” Jeff Gould, June 24, 2014, https://medium.com/@jeffgould/the-natural-history-of-gmail-data-mining-be115d196b10
28 Way Back Machine, Google Apps for Education — Security and Privacy, April 2014, https://web.archive.org/web/http://https:/www.google.com/edu/privacy.html
29 Way Back Machine, Google Apps — Security and privacy overview, August 2013, https://web.archive.org/web/20130821070235/http:/support.google.com/a/bin/answer.py?hl=en&answer=60762
30 CNET, “Google filing says Gmail users have no expectation of privacy,” August 13, 2013, https://web.archive.org/web/20130821070235/http:/support.google.com/a/bin/answer.py?hl=en&answer=60762/. As noted in the article, “Google has made it clear that people who send or receive e-mail via Gmail should not expect their messages to remain private.”
31 Official Google for Work Blog, “Protecting Students with Google Apps for Education,” April 30, 2014, http://googleforwork.blogspot.com/2014/04/protecting-students-with-google-apps.html