Digital education platforms are also a veritable gold mine of information for researchers looking to improve education, as they generate data on how millions of students are learning.

But using that data raises ethical and legal challenges: how can personal information be shared responsibly with researchers without exposing students' data to the outside world?

A consortium of education researchers and learning platforms is currently developing a potential solution: a system in which researchers never see the actual data.

The project, called SafeInsights, is led by OpenStax at Rice University and supported by a five-year, $90 million grant from the National Science Foundation.

The idea is that SafeInsights acts as a bridge between learning platforms and research partners, with collaborators fleshing out how the exchange works to protect student privacy.

“Under normal circumstances, we would be taking data from educational websites and apps and giving it to researchers to study, analyze and learn from,” said JP Slavinsky, SafeInsights executive director and OpenStax technical director. “Instead, we’re taking researchers’ questions to that data. This creates a safer research environment that makes it easier for schools and platforms to participate, because the data stays where it already exists.”

Deeper insights at scale

Another way to think of SafeInsights is as a telescope, according to Slavinsky and colleague Richard Baraniuk, founder and director of OpenStax, which publishes open-access course materials. It will allow researchers to peer into vast amounts of data from the University of Pennsylvania’s massive open online courses and from learning platforms such as Quill.org.

Researchers formulate questions, convert them into computer code that can sift through the data, and submit that code to a learning platform. Once the results are generated, they are returned to the researchers without the underlying data ever being shared directly.
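The workflow above can be sketched in a few lines of code. This is a minimal, hypothetical illustration of the "bring the question to the data" model, not SafeInsights' actual system: the class and function names, and the small-group suppression rule, are all assumptions made for the sketch.

```python
# Hypothetical sketch of a query-to-the-data enclave.
# Raw student records never leave the platform; only aggregates do.
from statistics import mean


class LearningPlatform:
    """Holds student records privately and runs researcher queries locally."""

    def __init__(self, records):
        self._records = records  # researchers get no direct access to this

    def run_query(self, query_fn, min_group_size=10):
        """Execute a researcher's query function and return only the aggregate.

        Refuses to release results computed over too few students, a common
        small-cell suppression rule in data enclaves (threshold is illustrative).
        """
        if len(self._records) < min_group_size:
            raise ValueError("group too small to release results safely")
        return query_fn(self._records)


# The researcher expresses a question as code...
def average_quiz_score(records):
    return mean(r["quiz_score"] for r in records)


# ...and the platform runs it where the data lives, returning one number.
platform = LearningPlatform(
    [{"student_id": i, "quiz_score": 60 + i} for i in range(20)]
)
print(platform.run_query(average_quiz_score))  # prints 69.5
```

The key design point is that the researcher's code travels to the data rather than the data traveling to the researcher, so the platform can inspect and gate what leaves.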

“This is truly a partnership, with researchers working with schools and platforms to jointly solve interesting problems,” Slavinsky said. “We’re offering that telescope so that other people can bring in their own research questions they want to answer. So we don’t get too involved in what specifically is being asked; we focus on getting as many questions answered as possible.”

One reason this model is so powerful, Baraniuk says, is that it can scale up educational research. Many studies have small sample sizes, often around 50 college students participating as part of a psychology class, he explains.

“A lot of the research is about college freshmen, right? Well, this is not representative of a wide variety of students,” Baraniuk says. “The only way we can see how widespread an effect is is by doing a large-scale study. So the first key behind SafeInsights is partnering with digital education websites and apps that are literally hosting millions of students every day.”

Another aspect of this project that he believes will open new doors for researchers is the diversity of the student population represented by the learning platform partners. Partners include educational apps and learning management systems for reading, writing, and science.

“The idea is that by putting all these pieces of the puzzle together, we can get a more complete picture of these students on a much larger scale,” Baraniuk says. “Our big goal is to try to remove as much friction as possible so that more useful research is done and more research-backed pedagogies and techniques are applied in practice: how do we remove that friction and still really keep everything protected?”

Building trust and protecting privacy

Before research is conducted, SafeInsights partners at the Future of Privacy Forum are helping develop policies that will shape how the program protects student data.

John Verdi, senior vice president of policy at the Future of Privacy Forum, said the goal is to build privacy protections into all systems. As part of this, he is helping develop what he calls a “data enclave,” a process that allows researchers to query data in learning platforms without direct access. Other aspects include helping develop review processes for how research projects are selected, training researchers on privacy, and publishing lessons learned about bringing privacy to the forefront.

“Even with good technical safeguards, even with good ethical reviews, ultimately it is up to the researchers themselves to use the system responsibly,” Verdi says. “They need to understand how the system works.”

Student data privacy protection in education is generally “woefully underfunded,” he says, but protecting that information can help students trust learning platforms and ultimately create research opportunities like SafeInsights.

“It’s wrong to put the onus on students and parents to protect their data,” Verdi said. “Instead, what we need to do is build a digital infrastructure that respects privacy by default and ensures that information is kept confidential and used ethically.”


