This piece was originally written for the course "Governing Automated Decisions."


 

The Limitations of Current Student Data Privacy Law

An Examination of University of Arizona’s Student Dropout Prediction Algorithm

 

“Just as Amazon boasts of having a 360-degree view of the customer—being able to predict what an individual will buy before they do it—so too can other organizations leverage data to predict and respond to the evolving needs of those they serve.”

-Sudha Ram, University of Arizona professor[1]

 

For three years, freshman students at the University of Arizona were tracked through their student ID swipe cards, from every on-campus building they entered to every purchase they made. The amount of time they spent at each location, the time of day, how many other people they encountered—all of this information and more was logged and analyzed, ultimately culminating in an algorithm to predict the likelihood of students dropping out of school. Sudha Ram, a professor of management information systems at the university, developed the program as a means of improving graduation rates, which matter for a school’s reputation and national rankings. After years of gathering data on freshmen, Ram was able to identify who was most at risk by looking at daily routines and social interactions—the strongest predictors of dropout—and built a predictive algorithm on this data. The goal is to implement the tool on a university-wide scale, determining which students are most at risk and targeting them with additional support to remain in school.[2]

Schools have been collecting data on students, including test scores, grades, retention records, and more, for many years. But what seems to be changing is which information is collected, how it is collected, and for what purpose it is used.[3] This poses a number of risks, as this data can be misused in ways that harm students, and school systems often have poor data security that leaves them vulnerable to breaches.[4] Currently, the most comprehensive federal student data privacy law, the Family Educational Rights and Privacy Act (FERPA), allows the guardians of student minors to access education records, grants the right to request corrections, and restricts external disclosure of student information to other entities.[5] A number of other federal laws covering the data of young people intersect with these protections, such as the Children’s Online Privacy Protection Act (COPPA), which covers what personally identifiable information (PII) websites can collect from young children; and the Protection of Pupil Rights Amendment, which regulates the content of surveys schools can have students fill out. There is also a patchwork of state and local laws regarding student privacy, and student data privacy more specifically.[6]

But neither federal law nor Arizona state law has much of relevance to say about the dropout prediction tool developed at the University of Arizona. At issue is not external disclosure of information, nor access to collected data, but a potential intrusion upon the seclusion of students and administrative overreach within the institution itself—an issue of data minimization. The prediction tool was developed in-house, so to speak, by University of Arizona researchers with access to university data. Specifically, there are two separate but related actions in question: the initial collection of student data while the algorithm was being developed, and the use of this algorithm on students. The former has already occurred, and should be evaluated against current laws and policies on student privacy. The latter has not yet occurred, but must be subject to the same critical analysis in order to guide its implementation. Finally, there is a separate question: even if these two actions are legally permissible under current law, are they ethical or desirable? Is current law insufficient for truly protecting student privacy?

Because current student privacy laws deal primarily with external disclosure to third parties and with allowing students and their parents access to their data, they do not address the facts raised in this particular case. While the school’s collection of student data without notice, and its potential future tracking of students for the purpose of lowering dropout rates, might be permissible, this is an indication of the inadequacy of current laws. At the very least, student privacy law should mandate that schools provide explicit notice to students about the data they collect, and how they do so. This should cover not just specifically educational data, but other information like location as well. These laws should also allow students the opportunity to opt out of data collection when it is not necessary to the functioning of the school. Ideally, they would also set limits on the kinds of data schools are permitted to collect from students. These changes are necessary for creating educational environments that foster learning rather than paranoia, and for providing students with mechanisms to fight instances of administrative intrusion. They would also bring scrutiny to algorithmic systems that purport to provide objective judgments, but which must be closely evaluated for their potential biases and consequences.

 

I. Existing student privacy laws

 

A.    FERPA

FERPA was enacted in 1974, at a time when collection of student data was expanding beyond traditional measures like grades and attendance to capture personal and familial demographics. The data existed in paper form, stored in filing cabinets that typically never left a school’s administrative offices. Oftentimes, schools would not allow students or parents access to these student files, let alone give them an opportunity to challenge their contents—nor would they provide sufficient notice for consent to the data’s collection and/or disclosure to third parties. FERPA was introduced as a means of granting parents the right to access and correct their child’s student data (until the child turns 18), as well as greater control over student information shared with external parties.[7]

The law covers “education records,” defined as “records, files, documents, and other materials which…contain information directly related to a student…[and] are maintained by an educational agency or institution or by a person acting for such agency or institution.”[8] This includes PII, such as student names, addresses, Social Security numbers, and other information traceable to a student or their family. There are two general exceptions for which schools do not need parental consent to share: when sharing with “school officials” and when the information constitutes “directory information.” The former includes school teachers or staff, or relevant outside contractors, who have direct educational interests in the information. In these cases, schools must ensure the information is used only for its intended purposes, is not shared with others, and remains directly controlled by the district. The latter, “directory information,” includes information typically listed publicly in directories, like telephone numbers—schools must still inform the public which directory information they disclose, and allow the opportunity to opt out.[9]

FERPA was passed at a time when student data existed primarily in paper form and in limited quantity, before the advent of big data analytics and the proliferation of private “ed-tech” companies. The law places a significant decision-making burden on “school officials,” so that educators are tasked with managing the preferences of students—an overwhelming task without any accountability mechanisms. Given the extent to which data collection has permeated the educational experience, understanding and managing this process proves practically infeasible for teachers, students, and parents. For this reason, Elana Zeide characterizes FERPA as practically, politically, pedagogically, and philosophically insufficient for achieving its original purpose in the era of big data.[10]

 

School Obligations under FERPA

Obligation to inform parents annually of their right to review their student’s education record.

Obligation to establish a process for parents to review and seek to amend their student’s record.

Obligation to inform parents annually of the types of personally identifiable information that could be publicly released as school directory information.

Obligation to provide parents an opportunity to opt out of having any or all school directory information about their student publicly disclosed.

Obligation to seek parental consent in advance for the disclosure of personally identifiable data for any purpose not authorized in law.

 

Parent Rights under FERPA

Right to review their child’s education records.

Right to have school amend their child’s record, if it is inaccurate, misleading, or otherwise in violation of a student’s privacy.

Right to be informed about the types of personally identifiable information schools could publicly release as directory information about their child.

Right to opt out of public disclosures of any or all school directory information about their child.

Right to opt out of disclosure of personally identifiable information about their child for any purpose not authorized in law.

 

Figure A: An explanation of school obligations and parent rights under FERPA, from the Foundation for Excellence in Education.[11]

 

B.    PPRA

The Protection of Pupil Rights Amendment (PPRA), also known as the Hatch Amendment, was passed in 1978 and requires that schools allow parents access to surveys and other evaluative material before they’re given to students.[12] It allows parents the opportunity to opt their children out of surveys covering particular topics. PPRA also includes parents in the process of creating policies regarding the collection, use, and disclosure of student PII for marketing or sale to third parties, with an exception for “educational products or services.”[13]

 

C.     COPPA

The Children’s Online Privacy Protection Act (COPPA), while not technically in the realm of student privacy, nevertheless touches on similar concerns. COPPA, enforced by the FTC, lays out the responsibilities of website operators in obtaining parental consent before collecting the personal information of children under the age of 13.[14] In some instances, schools may provide consent on parents’ behalf, provided they understand the details of how the collected information will be used and can ensure it will be to the benefit of the school.[15]

 

D.    State laws

State and local student privacy laws vary by jurisdiction, and the landscape is constantly changing. For the purposes of this analysis, the focus will be on Arizona law, the jurisdiction in which the University of Arizona falls. Arizona state law adopts FERPA’s requirements for the retention and disclosure of “student educational records,” which include general PII and basic information like attendance, but does not extend this protection to “student level data,” which includes information on student behavior. It requires that students give consent to release transcripts to the Arizona militia, the US armed services, or representatives of post-secondary institutions. There are also additional exceptions to FERPA regarding the sharing of “student level data” with third parties—for example, public educational institutions may share this data with the department of juvenile corrections. While the state does not mandate a Chief Privacy Officer for school districts, it does have a student data governance commission, and it permits individuals to file complaints over suspected knowing violations of FERPA or the Privacy Act. But perhaps most tellingly, Arizona does not give students or parents the ability to opt out of data collection or sharing.[16]

 

II. University of Arizona dropout prediction tool

 

Professor Ram’s algorithm was based on three years of collected data, but has yet to be implemented at the school. During the data collection phase, Ram and her team tracked freshman students through their “CatCard” ID swipes at about 700 locations around campus, including residence halls, the library, the recreation center, academic buildings, vending machines, food services, and more. Putting together this location data from the whole freshman student body, they were able to create maps of behavior patterns showing potential points of interaction between students. If more than one student used their CatCard at the same place around the same time, this increased the likelihood of student interaction. The team looked at how these patterns changed over the course of the semester, noting whether students’ social connections appeared to be strengthening or weakening over time. The data also allowed the team to analyze the regularity of routines.[17]
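
To make the co-location logic concrete, the following is a minimal sketch of how interactions might be inferred from raw swipe logs. It is an illustration under stated assumptions (the record format, field names, and 30-minute co-location window are hypothetical, since the published research does not specify its exact parameters), not a reconstruction of the team’s actual code.

```python
from collections import defaultdict
from datetime import timedelta

# Hypothetical record format: (student_id, location_id, timestamp).
# The 30-minute window is an assumed parameter, not the study's.
CO_LOCATION_WINDOW = timedelta(minutes=30)

def infer_interactions(swipes):
    """Count, per pair of students, how often their swipes co-occur
    at the same location within the window -- the proxy for
    "interaction" the Arizona research describes."""
    by_location = defaultdict(list)
    for student_id, location_id, ts in swipes:
        by_location[location_id].append((ts, student_id))

    pair_counts = defaultdict(int)
    for events in by_location.values():
        events.sort()  # chronological order within each location
        for i, (ts_i, s_i) in enumerate(events):
            for ts_j, s_j in events[i + 1:]:
                if ts_j - ts_i > CO_LOCATION_WINDOW:
                    break  # later swipes are even further apart
                if s_i != s_j:
                    pair_counts[tuple(sorted((s_i, s_j)))] += 1
    return pair_counts
```

Tracking how these pair counts rise or fall from week to week would correspond to the team’s observation of whether students’ social connections were strengthening or weakening over the semester.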

 


Figure B: The figure on the left shows traffic patterns of freshmen based on their CatCard swipes in a given area on campus between 10AM and 2PM during a weekday. The figure on the right shows traffic patterns for a given area on campus between 6AM and 10PM during a weekday.[18]

 

As it turns out, interaction with other students and regularity of routine are major predictors of student dropout. Those with fewer social interactions and/or without established routines are more at risk of dropping out. In fact, Ram and her team claimed that when this behavioral data was combined with demographic information and other predictive measures like grades, they could accurately predict 85-90% of the freshmen who ended up dropping out before their second year. This is an improvement over the model the school currently uses, which predicts the likelihood of dropping out with 73% accuracy based on descriptive survey data and indicators like academic performance and financial aid. Once the new tool is rolled out, the school also plans to create an online dashboard so that advisors can access this data in real time and assess student risk at any particular moment. It is also considering adding Wi-Fi data from about 8,000 hubs on campus to the algorithm. “It’s really not designed to track their social interactions, but you can, because you have a timestamp and location information,” Ram concedes[19]—though given the usefulness of social interactions in predicting dropouts, it would appear this is precisely what it’s designed to track.
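
The reported accuracy figures come from combining behavioral signals with demographic and academic ones. As a rough sketch of what such a combination could look like in practice, the snippet below trains a simple logistic regression; the input file, feature names, and model choice are assumptions for illustration, not the study’s actual pipeline.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical per-student feature table; column names are illustrative
# stand-ins for the kinds of variables the study describes.
df = pd.read_csv("freshman_features.csv")  # assumed input file
X = df[[
    "n_distinct_contacts",  # size of the inferred interaction network
    "routine_regularity",   # stability of the weekly swipe schedule
    "gpa",                  # academic performance
    "financial_aid",        # financial indicator
]]
y = df["dropped_out"]       # 1 if the student left before year two

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```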

There are no clear limits on what data can and will be used in building this tool, nor on how the data will be used or presented. The online dashboard may only be available to students’ academic advisors, but its intrusiveness creates a sort of panopticon for students.[20] Overall, there are two separate but related issues involved in the dropout prediction tool: the initial collection of sensitive student data, seemingly without any notice or chance to opt out, during the development of the algorithm; and the deployment of the algorithm on students in real time. The former has already happened, and raises questions about whether the university violated the privacy of its students. The latter has not yet occurred, but should the tool be used, it will raise questions about surveillance of students and its effects on educational environments.

 

A.    University of Arizona student data privacy policies

The CatCard is the standard ID issued to all University of Arizona students as freshmen, and provides both identification and access to necessary services around campus such as residence halls, dining halls, and academic buildings. Each card contains a digital photo of the student, a digital signature, a SmartChip that gives contactless access to doors and purchases, a randomly assigned ISO number, and a magnetic stripe that allows access to other campus services.[21] As for personal information stored on these swipe cards, the CatCard terms and conditions merely state that it is “subject to applicable law and university policies,” and “will not be provided to any outside agency by the CatCard office, unless permitted by law.”[22] There is no indication which policies this refers to, though the university does publish its policies on information security and electronic privacy.

The information security policies generally deal with how the university handles and protects data.[23] Included in these is a statement regarding its protection of students who refuse to provide their Social Security numbers, in accordance with the Privacy Act of 1974.[24] They also include electronic privacy policies, which allow the university to collect PII from students through email and online forms, but not to release this information to outside parties unless compelled. The university claims it “…does not sell or distribute confidential information it collects online to individuals or entities not affiliated with the University, except in…very limited circumstances.”[25] Missing from the policies is any limitation on how the university itself uses student data internally. Neither the CatCard terms and conditions nor the university’s general policies explicitly state that the location and purchasing data generated by CatCards was tracked for research purposes, or that it may be used to monitor students in the future. Students were not given notice of this potential use of their data, and were not given the opportunity to opt out of the program. Yet nothing in the school’s policies explicitly prohibits the university from collecting or using data for these purposes either, or requires the school to give notice of it. The university does, however, give notice of the inclusion of CatCard ISO numbers in the school’s “Enterprise Directory Service,” a “repository of ‘live’ identity data” for students, staff, faculty, and sponsored guests.[26]

 

B.    Arizona state student privacy laws

There is little in Arizona student privacy law that is of relevance to the dropout prediction tool. These laws deal primarily with the disclosure of personal student data to third parties, and track with the protections set forth by FERPA. Arizona law also has no formal definition of “educational records,” meaning it is unclear whether the law would even address a school’s collection of data unrelated to specific educational purposes, such as location. Most importantly, there is a notable lack of explicit requirements that students and parents be given the ability to opt out of data collection by school districts.[27] In excluding this mechanism, Arizona state law permits the university’s collection and use of student data for the purposes of the tool.

 

C.     Federal laws

Once again, federal student privacy laws generally do not touch on information collection within institutions. PPRA allows students and parents to access surveys and evaluation materials that are given to students. COPPA regulates the personal information website operators may collect from children under 13.[28] FERPA ensures parental access to student data (if the student is under 18), and sets limits on what school districts can share with third parties.[29] Federal laws thus do not cover the dropout prediction tool.

 

D.    Relevant case law

Most of the relevant court cases deal with external disclosure, though a number can help shape how to think about this situation. While instructive, these cases still, it should be noted, deal with disciplinary or investigatory action, not questions of the limits of general data collection. In New Jersey v. T.L.O., the Supreme Court ruled that schools are permitted to search students when there are reasonable grounds to suspect the search will uncover evidence of lawbreaking or school policy violations on the part of the student. The methods used for the search must be related to its objective, and cannot be excessively intrusive. Additionally, the Court found that the Fourth Amendment protects objectively reasonable expectations of privacy, but not merely subjective expectations of privacy.[30] In Vernonia School District v. Acton, the Court ruled that a school’s random drug screening of student athletes, while technically a search, was not overly intrusive in relation to its purpose.[31] Thus, in light of these rulings, the question becomes: what constitutes a reasonable objective expectation of privacy, and where is the line drawn regarding proper methods and goals? On which side of this line does the University of Arizona’s dropout prediction tool fall?

In United States v. Jones, the Court found that law enforcement officials seeking to place GPS tracking devices on the cars of suspects must get a warrant, and that the placement of the monitor was, in this case, a trespass. Concurring Justices reasoned that 28 days of monitoring Jones’ car via GPS violated his reasonable expectation of privacy because of the length of time and its constant nature.[32] The dropout prediction tool makes use of data from students’ CatCards denoting locations and purchasing behavior. It is not continuous but punctuated, collecting data at frequent enough intervals to paint a general but not exact picture of student activities and locations. And it would operate in real time throughout the academic year. Yet the reasonable expectation of privacy (REOP) test deployed in Jones may not be appropriate in this case, for two reasons. First, this case deals with a school’s general practice, whereas Jones dealt with federal law enforcement searches. Second, the REOP test has an inherent paradox: as soon as there is awareness of a particular practice—student tracking, for instance—one could argue that the expectation of privacy has disappeared. Students thus could not reasonably expect to go unmonitored, because they are generally aware the school tracks them; but this says nothing of whether students should be subjected to this tracking.

 

E.     Ethics & Risks

On a basic level, the dropout prediction tool would likely make students feel uncomfortable on campus, as if their behaviors were being closely watched. This alone constitutes an argument against using the tool, to be weighed against the utility its outcomes would bring the school and its students. But the harm can extend beyond an invasion of privacy on its face. There’s also the risk that deployment of the tool would create a hostile learning environment that negatively affects educational outcomes. Does a student’s ability to learn and grow lessen when they’re preoccupied with the thought of being monitored and analyzed? And is it possible that, by using predictive tools, schools actually reinforce notions about whether particular students will be successful or not? It seems plausible that such tools run the risk of becoming self-fulfilling prophecies for students.[33] Without speculating too much, one can also imagine ways such a tool could be misused—to stalk students, expel them, exert more granular control over their daily behaviors, or deny them scholarships or admissions. It’s also possible to imagine such a tool having a disparate impact on vulnerable populations. Because the algorithm counts financial status as a risk factor, students from lower socioeconomic backgrounds would be subjected to much greater surveillance.

It’s worth asking whether the initial data collection, and the use of the tool, is meaningfully different from current and historical practices. Schools have, after all, always collected data about their students.[34] ID swipe cards that provide access to on-campus buildings and purchases are encoded with information about students. What’s different about tools like the University of Arizona algorithm, however, is that while granting access based on ID information is directly necessary to the functioning of the swipe cards and to the security of the university, tracking students is not. Without a verification feature, swipe ID access would not work; without keeping track of student locations and behaviors, the school is still able to function properly. And were the school to use a dashboard to monitor students, as planned, this would represent an escalation in the level of granular surveillance schools exert over students.

It’s also unclear whether there is any line in terms of what data schools are allowed to collect. They have moved beyond educational records and information necessary to function (like names and addresses) to location and behavior. Since more data usually results in better tools, schools will be incentivized to collect more and more student data, and will always be able to justify this by saying it serves the educational goal of keeping students in school. This pressure is likely to grow as machine learning and similar techniques become more ubiquitous; they find patterns in sometimes seemingly unrelated and disparate data, which the predictions they make then transform into “relevant” data.[35] The result is that no data is off limits. This necessitates drawing a line between what data is and is not permissible for schools to collect from students. That line is currently not drawn.

The data on which the Arizona tool is based, collected from students over a three-year period, may also contain hidden biases or inaccuracies. It assumes that being in the same location as other students at roughly the same time constitutes “interacting,” though it’s hardly obvious this is the case. It might misread the behaviors of commuter students and others who live off campus, whose on-campus activity patterns likely differ from those of students who live on campus. There might also be variations in behavioral norms between groups of people (racial, religious, etc.) that are ignored, making the tool less effective for certain groups. As with any social indicator, those measured by this tool may be subject to Campbell’s Law, which states: "The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor."[36] Once students are generally aware of how the tool works, they may be incentivized to change their behavior—whether to avoid or attract attention from advisors—to align with the algorithm’s measurements.

The ultimate utility of the tool must also be questioned. Because it’s based on correlation, not causation, we cannot definitively say that more social interaction and stronger routines lead to a lower likelihood of dropping out. Thus, it’s unclear that the tool would be helpful in the way the university hopes. It may identify at-risk students, but if it says nothing about causation, administrators would be none the wiser about which remedial mechanism to deploy in response. It’s true that it may help focus time, energy, and resources on specific students, after which advisors can take a clinical approach to resolving the students’ issues. But it’s not clear this greater efficiency—which could potentially be achieved, alternatively, through greater investment in advisors rather than algorithm development—justifies the harms to privacy and the overall learning environment. The researchers also concede that the tool produces a number of false positives, which trigger the unnecessary deployment of resources to students who are not actually at risk.[37] This creates inefficiencies that might negate the usefulness of the tool itself.
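
A back-of-the-envelope calculation shows why false positives matter even at high headline accuracy. Every figure below is an assumption chosen for illustration; none comes from the study itself.

```python
# All figures are hypothetical, chosen only to illustrate the point.
cohort = 7000               # assumed freshman class size
base_rate = 0.15            # assumed share who actually drop out
sensitivity = 0.90          # upper end of the reported 85-90% claim
false_positive_rate = 0.10  # assumed rate of flagging persisters

dropouts = cohort * base_rate        # 1,050 true dropouts
persisters = cohort - dropouts       # 5,950 students who persist

true_positives = sensitivity * dropouts             # 945 flagged correctly
false_positives = false_positive_rate * persisters  # 595 flagged wrongly

precision = true_positives / (true_positives + false_positives)
print(f"{false_positives:.0f} students flagged unnecessarily; "
      f"only {precision:.0%} of flags are true dropouts")
```

Under these assumed numbers, roughly four in ten flagged students would not in fact have dropped out, and each of them would draw advising resources away from students who need them.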

 

F.     Other similar tools

The University of Arizona’s dropout prediction tool is not the first instance of a school monitoring students and using the data to build a classification algorithm in service of its educational goals. Georgia State University (GSU) created a suite of algorithms, called Graduation and Progression Success (GPS), that uses student demographics and behavioral data to alert advisors about students who are struggling in classes, and to help students choose classes and majors that suit them.[38] It was based on 10 years of collected data, and tracks about 30,000 students daily. The school claims to have increased freshman fall-to-spring retention by five percentage points, and to have reduced the number of extra classes graduating seniors take.[39] The system identifies 800 specific academic behaviors that predict a struggling student, and alerts an advisor to reach out if a student exhibits one of them. According to the school, in 2003 just 32% of students received diplomas; after implementing this tool, 76% of students who enroll graduate on time (though not necessarily from GSU).[40]

Arizona State University has also experimented with data-driven student targeting. It developed a system to monitor the progress students are making in their majors; students who get “off track,” by not registering for certain classes or not maintaining certain grades, might be forced to change majors. It also has a Facebook app that analyzes the data on students’ Facebook profiles and suggests friends based on similarities, in an effort to encourage socializing.[41] Many schools across America have been testing these kinds of tools in efforts to improve education, increase graduation rates, personalize academic experiences, and make more efficient use of scarce resources. Proponents see big data as a means of making education the “great equalizer” by tailoring it to each individual’s specific needs.[42] But these potential benefits must be weighed against the potential harms—privacy-related and otherwise—that these systems might produce.

 

III. The insufficiency of current student data privacy law

 

Current major student data privacy law is insufficient because it is concerned primarily with a school district’s external disclosure of student education information to third parties. This is true on the federal level, with FERPA and other laws, as well as on the state level. These laws by and large do not limit the information schools can collect from students and use internally. The result is that young people are left open to levels of surveillance that might violate their privacy or actually harm learning outcomes. The Electronic Privacy Information Center (EPIC) highlights California’s Student Online Personal Information Protection Act (SOPIPA) as a model for student data privacy laws. Broadly, the statute prohibits K-12 service operators from targeting students with ads based on their data, building profiles of students for commercial ends, selling student data, and disclosing this data, with certain exceptions.[43]

When this law was enacted in 2014, it was one of the strongest in the country. Yet it can still be improved in many ways, as can student privacy laws more generally. First, it could extend protection to all students, not just those in K-12. Second, it could create stronger enforcement mechanisms, providing students with private rights of action against abusive companies and schools. Third, it could mandate that schools and third parties make public what information they collect and for what purposes. Fourth, it could impose data retention limits, so student data is not kept in perpetuity. Fifth, it could allow students to request the data held about them, and to delete or correct it if necessary. Finally, it could limit the types of data schools and companies collect, particularly data not directly related to educational purposes, such as biometric or social media data.[44] This last suggestion is most relevant to the dropout prediction tool at the University of Arizona, which draws on location and behavioral data. A new law could regulate the collection of this data by schools; alternatively, it could allow the collection but set limits on how schools and companies are permitted to use the data.

At the very least, schools should be required to provide notice to students that, first, their data is being collected for the purpose of building predictive algorithms; and second, that their data is being collected and subjected to those algorithms in order to make certain determinations. The University of Arizona failed to do the former through its CatCard and university policies when building its dropout prediction tool, though it did anonymize the data.[45] It has not implemented the tool yet, but should it do so, the school should provide explicit notice to students about its deployment. It also seems reasonable, at both stages, to allow students to opt out of these programs, particularly when they collect sensitive data not directly relevant to educational outcomes. While “notice and choice” models might not be the ideal form of data privacy protection, they are certainly preferable to the opacity with which schools can now collect and use student data.[46] This is the minimum amount of care schools should take toward student data privacy. For even more protection, the law could provide clearer guidelines on what information schools may and may not collect about and from students, to prevent schools from adopting intrusive measures.

 

IV. Conclusion

 

The dropout prediction algorithm developed at the University of Arizona raises a number of student privacy questions. Did the school violate university policies, or state or federal laws, in collecting the data without providing notice to students? Would implementing the tool, with or without notice, violate those policies or laws? The answer to both appears to be no: the tool is permissible. Current law deals primarily with external disclosure of student data, rather than setting any kind of limit on internal collection. This is an indictment of current laws regarding student data and privacy. Because this tool, and others like it, collect data beyond explicitly educational information, it borders on intrusive, subjecting students to high levels of surveillance. This may have a negative effect on learning outcomes, if students are preoccupied with the thought of being constantly monitored and evaluated. The tool in question also raises questions about the data on which it is based—what kinds of biases, inaccuracies, and blind spots might lie within it? Will the tool have a disparate impact on certain populations as a result of that data? There is also no clear line in terms of what schools are allowed to collect from students, opening the door to the collection of increasingly personal and sensitive information in pursuit of lowering dropout rates, or other supposed educational goals. At the very least, student privacy law should mandate that schools give students notice of this kind of data collection; ideally, it would guarantee the ability to opt out as well. The law could also draw a clearer line around what data schools may and may not collect from students, as the demand for their data will only continue to grow.

 

References

 

[1] See Alexis Blue, Researcher Looks at ‘Digital Traces’ to Help Students, UANEWS (March 7, 2018), https://uanews.arizona.edu/story/researcher-looks-digital-traces-help-students

[2] Ibid.

[3] See Center for Democracy & Technology, State Student Privacy Law Compendium, BAKERHOSTETLER (October 2016), https://cdt.org/insight/state-student-privacy-law-compendium/

[4] See Sonja Trainor, Student data privacy is cloudy today, clearer tomorrow, KAPPAN MAGAZINE (February 2015), Vol. 96, No. 5, http://journals.sagepub.com/doi/pdf/10.1177/0031721715569463

[5] See 20 U.S. Code § 1232g (FERPA)

[6] See Sonja Trainor, Student data privacy is cloudy today, clearer tomorrow

[7] See Elana Zeide, Student Privacy Principles for the Age of Big Data: Moving Beyond FERPA and FIPPS, 8 DREXEL LAW REVIEW 339 (2016) (online corrected), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2821837

[8] See 20 U.S. Code § 1232g (FERPA)

[9] See Sonja Trainor, Student data privacy is cloudy today, clearer tomorrow

[10] See Elana Zeide, Student Privacy Principles for the Age of Big Data

[11] See Foundation for Excellence in Education, Protecting K-12 Student Privacy in a Digital Age (December 2015)

[12] See 20 U.S. Code § 1232h (PPRA)

[13] See Sonja Trainor, Student data privacy is cloudy today, clearer tomorrow

[14] See 15 U.S. Code Chapter 91 (COPPA)

[15] See Sonja Trainor, Student data privacy is cloudy today, clearer tomorrow

[16] See Center for Democracy & Technology, State Student Privacy Law Compendium

[17] See Alexis Blue, ‘Digital Traces’

[18] Ibid.

[19] Ibid.

[20] See Michel Foucault, Discipline and Punish: The Birth of the Prison, VINTAGE BOOKS (April 25, 1995; originally 1975)

[21] See Financial Services Office: CatCard, UNIVERSITY OF ARIZONA, https://catcard.arizona.edu/

[22] The specific dates when this data was originally collected from students are not available. However, the CatCard terms and conditions from 2010 to the present are available through the Internet Archive. This period most likely captures the period of data collection, as the research was published in 2015.

Current policies (2015-2018): https://catcard.arizona.edu/sites/catcard/files/catcard_terms_and_conditions.pdf

2013-2015 policies: https://web.archive.org/web/20141029233538/http://uabis.arizona.edu/eforms/forms/catcard_terms_and_conditions.pdf

2010-2013 policies: https://web.archive.org/web/20120130213032/http://uabis.arizona.edu/eforms/forms/catcard_terms_and_conditions.pdf

[23] See Information Security: Policy and Guidance, UNIVERSITY OF ARIZONA, https://security.arizona.edu/policy

[24] See Federal Privacy Act and SSN Usage IS-G1001, UNIVERSITY OF ARIZONA (2008), https://security.arizona.edu/sites/securitysiab/files/isg1001.pdf

[25] See Electronic Privacy Policy IS-1000, UNIVERSITY OF ARIZONA (March 9, 2010), https://policy.arizona.edu/information-technology/electronic-privacy-policy

[26] See Financial Services Office: CatCard, Enterprise Directory Service (EDS), https://catcard.arizona.edu/dept-services/eds

[27] See Center for Democracy & Technology, State Student Privacy Law Compendium

[28] See Sonja Trainor, Student data privacy is cloudy today, clearer tomorrow

[29] See 20 U.S. Code § 1232g (FERPA)

[30] See New Jersey v. T.L.O., No. 83-712, 469 U.S. 325 (1985)

[31] See Vernonia School District 47J v. Acton, No. 94-590, 515 U.S. 646 (1995)

[32] See United States v. Jones, No. 10-1259, 565 U.S. 400 (2012)

[33] See Sophie Quinton and National Journal, Are Colleges Invading Their Students’ Privacy?, THE ATLANTIC (April 6, 2015), https://www.theatlantic.com/education/archive/2015/04/is-big-brothers-eye-on-campus/389643/

[34] See Center for Democracy & Technology, State Student Privacy Law Compendium

[35] See Kate Crawford & Jason Schultz, Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms, 55 BOSTON COLLEGE LAW REVIEW 93 (2014)

[36] See Donald T. Campbell, Assessing the Impact of Planned Social Change, Occasional Paper Series Paper #8, Dartmouth College (December 1976), http://portals.wi.wur.nl/files/docs/ppme/Assessing_impact_of_planned_social_change.pdf

[37] See Sudha Ram, Yun Wang, Faiz Currim, & Sabah Currim, Using Big Data for Predicting Freshmen Retention, THIRTY SIXTH INTERNATIONAL CONFERENCE ON INFORMATION SYSTEMS (2015), https://aisel.aisnet.org/cgi/viewcontent.cgi?article=1380&context=icis2015

[38] See Sophie Quinton and National Journal, Are Colleges Invading Their Students’ Privacy?

[39] See Student Success Programs: GPS Advising, GEORGIA STATE UNIVERSITY, https://success.gsu.edu/initiatives/gps-advising/

[40] See Martha Dalton, How Georgia State Stopped Students From Slipping Through the Cracks, WABE (December 18, 2017), https://www.wabe.org/georgia-state-stopped-students-slipping-cracks/

[41] See Marc Parry, Big Data on Campus, THE NEW YORK TIMES (July 18, 2012), https://www.nytimes.com/2012/07/22/education/edlife/colleges-awakening-to-the-opportunities-of-data-mining.html

[42] See Andrew Giambrone, When Big Data Meets the Blackboard, THE ATLANTIC (June 22, 2015), https://www.theatlantic.com/education/archive/2015/06/big-data-student-privacy/396452/

[43] See State Student Privacy Policy, ELECTRONIC PRIVACY INFORMATION CENTER, https://epic.org/state-policy/student-privacy/

[44] Ibid.

[45] See Sudha Ram, Yun Wang, Faiz Currim, & Sabah Currim, Using Big Data

[46] See Solon Barocas & Helen Nissenbaum, On Notice: The Trouble with Notice and Consent, PROCEEDINGS OF THE ENGAGING DATA FORUM (October 2009), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2567409