Peter Bergman EdTech Plenary Presentation


Okay. So I'm going to take a little bit of time initially to discuss some of the research we've done summarizing what works in education technology, and I'm going to rely heavily on the work of my colleagues at the Poverty Action Lab (J-PAL), who have done a full literature review of the RCTs and other types of studies on education technology in U.S. settings, so we'll obviously be focused on the U.S. for this talk. When I discuss this, I think it's important to consider not only when technology can reduce inequalities but also when it can exacerbate them, and that is perhaps particularly relevant in the education space. In thinking about these inequalities, I'll discuss them in terms of what they actually mean for student learning. Think about the 90/10 achievement gap: the gap in learning between students who come from families at the 90th percentile of the income distribution and students who come from families at the 10th percentile. We'll see cohorts of children on the x-axis and learning, as measured by test scores in standard deviations, on the y-axis, and I'll talk more about what that means shortly.

So here's what the achievement gap looks like over time. This comes from Sean Reardon; Rick Hanushek has also studied this. There's some debate about whether the gap is actually growing, as we see here, but what people do agree on is that the gap in learning between students from high-income and low-income families is large and persistent over time. What does that mean? If we look at kids born in the early 1980s, the gap in learning between students from lower-income and higher-income backgrounds is about a full standard deviation. And what does a full standard deviation mean? If you look at how much students learn from, say, fourth grade to eighth grade, that's roughly equivalent to a standard deviation. So kids from low-income backgrounds are, roughly speaking, four years of learning behind kids from higher-income backgrounds, and by the time you get to the 2000s cohorts we're talking something like four to six years. These gaps are extremely large, and thinking about what a standard deviation means will also help us benchmark how effective an intervention might be at closing this gap.

Great. So I'm going to cover a couple of topics. I certainly don't have time to cover everything, so I encourage you to look at J-PAL's ed tech literature review for more information on these studies; it has very nice summary tables. As I mentioned, we can benchmark these impacts, and this again is specific to the U.S. setting; if you were looking at lower-income country settings, the scale might be different. But we generally think that impacts in the range of zero to about a tenth of a standard deviation are small. Impacts between 0.10 and 0.25 are encouraging: for instance, a one-standard-deviation increase in teacher quality is worth between 0.10 and 0.15 standard deviations of learning. When you get up to 0.25 to 0.40, these are large impacts: the highest-quality charter schools, for instance, tend to generate impacts in this range. And anything above 0.40 you just rarely see in the U.S.; pretty much the only thing that gets to that range in the U.S. context is high-intensity tutoring.
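To make these benchmarks concrete, here is a minimal sketch of the rough arithmetic I have in mind, assuming the "one standard deviation is roughly four years of learning (fourth through eighth grade)" rule of thumb from above; the cutoffs and example effect sizes are just the illustrative numbers from this talk, not precise estimates.

```python
# Rough back-of-the-envelope benchmarking of effect sizes (U.S. K-12 setting).
# Assumes the rule of thumb above: ~1.0 SD of test scores ~ 4 years of learning
# (roughly the growth from 4th to 8th grade). Illustrative only.

SD_PER_YEAR_OF_LEARNING = 1.0 / 4.0  # ~0.25 SD per school year

def years_of_learning(effect_sd: float) -> float:
    """Convert a test-score impact in standard deviations to approximate years of learning."""
    return effect_sd / SD_PER_YEAR_OF_LEARNING

def size_label(effect_sd: float) -> str:
    """Bucket an effect size using the rough U.S. benchmarks discussed in the talk."""
    if effect_sd < 0.10:
        return "small"
    if effect_sd < 0.25:
        return "encouraging (comparable to +1 SD of teacher quality)"
    if effect_sd <= 0.40:
        return "large (top charter-school range)"
    return "very rare in the U.S. (high-intensity tutoring territory)"

examples = [
    ("ASSISTments overall impact", 0.20),
    ("+1 SD of teacher quality", 0.12),
    ("90/10 gap, early-1980s cohorts", 1.00),
]
for label, effect in examples:
    print(f"{label}: {effect:.2f} SD ~ {years_of_learning(effect):.1f} years of learning, {size_label(effect)}")
```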
Okay. So first I'm going to talk about access to technology: computers, internet, and so on, as well as access to information. There are a lot of randomized trials in this space.

We can look at computer provision; there have been several randomized trials of this. Rob Fairlie at UC Santa Cruz and Jonathan Robinson ran a randomized trial providing computer access in K-12 and found no impacts on learning there, but Rob has done other studies providing computer access to students in community colleges and found somewhat more promising evidence. One caveat is that there could be impacts on other outcomes we care about, like computer skills, but those would obviously take longer to observe.

On internet at home or in schools, there have been studies in England and in the U.S. with the E-Rate program: no impacts on learning outcomes.

Perhaps surprisingly, though, for broadband access in particular there is some evidence that it can improve college enrollment rates in post-secondary.

And then lastly, we can think of providing online information as a form of access, and here we see really sharply diverging usage between the haves and the have-nots. For instance, one aspect I've looked at is providing information online to parents about how their kids are doing in school: being able to log into a parent portal and look at your child's homework, grades, and so on. Here we can see the share of parents or students who have ever logged in on the y-axis and the share of low-income students in the school on the x-axis, and the two are sharply negatively correlated. This is true for other kinds of online access to information as well. For instance, I did a study looking at when the LA Times published teacher value-added ratings. Their hope was to equalize access to this information so that all families could respond to it; instead, it was families whose kids had higher test scores who tended to sort into the classrooms of higher value-added teachers. So this is one area where you really have to worry about exacerbating some of the inequalities we care about closing.

What about computer-assisted learning, these sorts of personalized learning platforms? CAL, or computer-assisted learning, is where students can log in via computer to learn math or reading. It can provide rapid feedback to students on how they're doing on particular skills, and this feedback is also transmitted to teachers and sometimes parents. It has the advantage of being scalable, but people also tout the adaptive content feature. What this means is that teachers are often teaching classrooms with students of a wide variety of abilities, so it's hard to target instruction, and the idea is that the software can tailor content to meet students where they are in terms of their learning proficiency. I put "adaptive" in quotes here because it's not very well defined; it can often be very ad hoc. My friend Matt Greenfield says worksheets are adaptive too: you can just pull them off the shelf and target instruction just as well that way, perhaps. So I think that's an area of personalization which, in practice, could be better studied and perhaps optimized further.

There have been a number of randomized trials of computer-assisted learning. I'll focus on one, ASSISTments, which was started by a computer scientist at WPI. It's a math homework help software: teachers can use it to provide math content to students, and students use it for about 40 minutes a week. There was an independent randomized trial of this particular platform in Maine, of all places, and I say that in part because Maine has an important contextual distinction: it has a one-to-one laptop program, which might make this program more feasible. The randomized trial found pretty sizable impacts, 0.2 standard deviations overall, and the impacts were larger, about 0.3, for students with below-average baseline performance.
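Since "adaptive" is, as I said, loosely defined in practice, here is a deliberately simple toy sketch of one thing it could mean: serve the next item whose difficulty is closest to a running estimate of the student's proficiency, and nudge that estimate up or down after each answer. This is my own illustration of the idea, not how ASSISTments or any particular product actually works.

```python
# Toy sketch of "adaptive" item selection: match item difficulty to an evolving
# proficiency estimate. Purely illustrative; real platforms vary widely and are
# often far more ad hoc (or far more sophisticated) than this.

from dataclasses import dataclass

@dataclass
class Item:
    name: str
    difficulty: float  # same arbitrary scale as proficiency

def next_item(items: list[Item], proficiency: float) -> Item:
    """Choose the remaining item whose difficulty is closest to the current proficiency estimate."""
    return min(items, key=lambda it: abs(it.difficulty - proficiency))

def update(proficiency: float, correct: bool, step: float = 0.2) -> float:
    """Nudge the proficiency estimate up after a correct answer, down after an incorrect one."""
    return proficiency + (step if correct else -step)

# Example session: a student starts near the middle of the scale.
pool = [Item("fractions-1", -1.0), Item("fractions-2", 0.0), Item("ratios-1", 0.5), Item("ratios-2", 1.2)]
prof = 0.0
for answered_correctly in [True, True, False]:
    item = next_item(pool, prof)
    prof = update(prof, answered_correctly)
    pool.remove(item)
    print(f"served {item.name}, correct={answered_correctly}, new proficiency estimate {prof:+.2f}")
```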

As I mentioned, J-PAL has combed through a number of these studies and, generally, they're quite positive, but almost all the benefits seem to accrue to math and not English. There are also some negative-to-null effects, and I think a lot of the issue here, which could be better studied, is implementation. Sometimes it's very difficult to disentangle whether software is failing because of implementation challenges or simply because it's ineffective, or whether those two things interact. Still, personalized or computer-adaptive learning is a quite promising area.

Lastly, I'll talk a bit about technology-based nudges. This covers a range of areas, kind of a catch-all, but it includes engaging parents in early childhood education, which I'll talk about as an example next; very importantly, improving the flow of information from schools to parents (it's remarkable how little children tell their parents about how they're doing in school, how important this kind of intervention can be, and how ubiquitous that is across the world); information about how to transition to and succeed in college, since the matriculation process is often complicated and there is potential for assistance there; mindset interventions fall into this space as well; and text-message-based adult education has been studied too.

Since we're here at Stanford, I'll focus on one conducted by some former Stanford researchers: Ready4K. This was text messages to parents. They did this in several settings, with parents of pre-K and kindergarten students. They would send facts about how important early literacy is (literacy skills were the focus), provide tips about what parents can do to help their children learn literacy and fluency, and then extend those tips through growth messages. So parents were receiving three text messages a week. Here's one example: "Point out the first letter in your child's name in magazines, at the store, and on signs. Have your child try. Make it a game: who can find the most?" I'll focus on the impacts on parental behaviors. They have an index of measures showing that parent engagement improved by almost a third of a standard deviation. On test scores, their initial study found comparable impacts on literacy; I believe in their subsequent study in kindergarten the results were more mixed, depending on whether the messages were personalized or standardized.
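When I say those parent-engagement results are on "an index of measures," I mean a summary index in the usual sense: standardize each survey measure against the control group and average the z-scores. A minimal sketch of that construction, with made-up measure values, looks like this:

```python
# Minimal sketch of a summary index of parental-engagement measures:
# z-score each component against the control-group mean and SD, then average.
# The measures and numbers below are simulated purely for illustration.

import numpy as np

rng = np.random.default_rng(0)
# Columns are three made-up engagement measures (e.g. "told a story", "pointed out letters", "read together").
control = rng.normal(0.0, 1.0, size=(200, 3))
treated = rng.normal(0.3, 1.0, size=(200, 3))  # shifted up purely for illustration

def summary_index(x: np.ndarray, mu: np.ndarray, sd: np.ndarray) -> np.ndarray:
    """Z-score each measure against the control group, then average across measures."""
    return ((x - mu) / sd).mean(axis=1)

mu, sd = control.mean(axis=0), control.std(axis=0, ddof=1)
idx_c = summary_index(control, mu, sd)
idx_t = summary_index(treated, mu, sd)
effect = (idx_t.mean() - idx_c.mean()) / idx_c.std(ddof=1)
print(f"Treatment effect on the engagement index: {effect:.2f} control-group SDs")
```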

Let me now talk at a somewhat higher level. My overall sense is that there are no panaceas here. As I mentioned, it's very hard to find interventions that close the gap by 0.4 standard deviations. I think what this means is that we have to think more like a diagnostician. Students face a range of issues and obstacles. I'm a former teacher, and I remember reading my students' IEPs; the range of contexts and problems they were coming from was just totally disparate. So I think we have to, one, evaluate what works, but then, two, figure out how to target what works for whom: can we identify who will benefit the most from a particular intervention? I had one study where we were texting parents about all of their child's missing assignments and grades, and at baseline I asked the teachers to guess who they thought would benefit most from the intervention. Ex post, we could check whether their predictions were borne out, and they had no clue. It would be interesting to know whether that's more commonly true, but the targeting I would see teachers do is often toward the most struggling students, and it's not clear that those students always benefit the most from a particular intervention.

I think large-scale evaluations are still important, but they're very expensive and time-consuming. Some researchers who are outliers can pull this off, but it's probably better suited to more specialized shops like Mathematica or RAND. And I'd say one other piece we lack research on, particularly in terms of randomized controlled trials, is how to make sure that something, particularly education technology, is implemented with fidelity. We really don't know much about that space; I think it has been much better studied in lower-income country contexts around health and other technology products, but not so much in the U.S.

I think a faster path to innovation, and here I'm harping on the same themes as the rest of the conference, is through researcher-platform partnerships. We can leverage these platforms to use their data, test new ideas, and then scale them across their existing distribution networks. In my experience, we're often bringing philanthropic dollars as a one-time fixed grant to do the development work, and then the low variable costs allow the company to sustain it over time.

A couple of examples of this from my own work. Going back to the gradebook platform, which saw that disparate usage by low-income and high-income families: in my original study I was texting out this information by hand over the weekends, but it's not that advanced a technology to scrape this information from the learning management system, pull the student information system (SIS) data for students' names and parents' phone numbers, and then use Twilio's API to pre-populate text messages and push them out automatically. So we did this to improve these school-to-parent information flows, and we integrated three types of alerts onto an existing learning management system platform. First, low-grade alerts: if a child's grade dips below 70, it fires off a text message once a month. Second, missing-assignment alerts: because we had access to all the gradebooks, we could pre-populate messages with how many assignments the student was missing in a given class. And third, absence alerts using by-class data: students are about twice as likely to miss an individual class as they are a full day, so we could text those out weekly as well.
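For what it's worth, the automation really is about as simple as it sounds. Here is a stripped-down sketch of the kind of pipeline I'm describing, assuming the gradebook and SIS contact data have already been exported into something tabular; the field names, thresholds, and credentials are placeholders for illustration, not anyone's production code.

```python
# Sketch of the school-to-parent alert pipeline: join gradebook records to SIS
# contact info, compose a pre-populated message, and push it out via Twilio.
# Field names, thresholds, and credentials are hypothetical placeholders.

from twilio.rest import Client

client = Client("ACCOUNT_SID", "AUTH_TOKEN")   # hypothetical credentials
SCHOOL_NUMBER = "+15551230000"                 # hypothetical sending number

def low_grade_alert(student: dict, course: dict) -> str | None:
    """Monthly low-grade alert: fires when the course grade dips below 70."""
    if course["grade_percent"] < 70:
        return (f"{student['first_name']}'s current grade in {course['name']} "
                f"is {course['grade_percent']:.0f}%. Reply to reach the teacher.")
    return None

def missing_assignment_alert(student: dict, course: dict) -> str | None:
    """Alert listing how many assignments are missing in a given class."""
    n = course["missing_assignments"]
    if n > 0:
        return f"{student['first_name']} is missing {n} assignment(s) in {course['name']}."
    return None

def send(parent_phone: str, body: str) -> None:
    """Push one pre-populated text message out through Twilio."""
    client.messages.create(to=parent_phone, from_=SCHOOL_NUMBER, body=body)

# Example: one student record joined from the LMS gradebook and SIS contact data.
student = {"first_name": "Ana", "parent_phone": "+15559876543"}
course = {"name": "Algebra I", "grade_percent": 64.0, "missing_assignments": 3}

for alert in (low_grade_alert(student, course), missing_assignment_alert(student, course)):
    if alert:
        send(student["parent_phone"], alert)
```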

We found that this had significant, positive impacts. We sent out 32,000 text messages across 22 schools for about $60. We reduced course failures by about a third and boosted class attendance by 17%. In my original study in LA, where it was me punching out the text messages, we improved GPA by about two-tenths of a standard deviation, and by about a tenth in a more scaled-up study. And we're now talking with existing companies about leveraging their distribution networks to scale this up: one of the largest providers of text messaging between schools and parents, the second-largest learning management system, and an attendance platform as well.

I'll give one last example of using their data for other things. Once we have this information, we have by-class attendance data, so working with the platform we could do a social network analysis of who skips class with whom. Here's a graph identifying which students skip class with which other students. And because we had layered the alerts intervention on top, randomizing which parents received them, we could test whether the intervention spilled over onto students in their network. If Hunt and I skip a lot of class together and my parents are receiving the text messages, is Hunt now more likely to attend class as well? We did find significant spillovers there, and we drew on work by Susan Athey and coauthors for estimation and inference, to see whether we could, in theory, target these kinds of attendance interventions more cost-effectively.
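The "who skips class with whom" graph is just built from the by-class attendance records: connect two students whenever they miss the same class period on the same day, weight edges by how often that happens, and then ask whether a treated student's untreated co-absentees start attending more. A rough sketch of the graph construction, with hypothetical column names and toy data:

```python
# Sketch of the co-absence network: link two students each time they miss the
# same class period on the same day; edge weights count shared absences.
# Column names and the toy records are hypothetical.

from itertools import combinations
import networkx as nx
import pandas as pd

absences = pd.DataFrame({
    "student_id": ["s1", "s2", "s3", "s1", "s2"],
    "date":       ["2019-10-01"] * 3 + ["2019-10-02"] * 2,
    "period":     [3, 3, 3, 5, 5],
})

G = nx.Graph()
for _, group in absences.groupby(["date", "period"]):
    for a, b in combinations(sorted(group["student_id"]), 2):
        w = G.get_edge_data(a, b, {"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# Untreated co-absentees of each treated student (hypothetical treatment assignment):
treated = {"s1"}
for t in treated:
    peers = [(nbr, G[t][nbr]["weight"]) for nbr in G.neighbors(t) if nbr not in treated]
    print(f"untreated co-absentees of {t}: {peers}")
```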
One last example comes from the housing space, and I think it speaks to this same high-fixed-cost, low-variable-cost point. The platform GoSection8.com provides housing listings for low-income families; you can think of it as a Zillow for Section 8 voucher recipients. Importantly, Zillow and Trulia provide information about neighborhoods, like school quality and GreatSchools ratings. GoSection8 did not: they provided a walk score and not much else about the neighborhood. So we raised some money and got them to randomize the addition of school quality information onto their listings, and we worked with the Department of Housing and Urban Development to see whether we affected how families search for housing and where they ultimately moved. We found that families did move to areas with better schools, and because these ratings are so closely tied to the socioeconomic status of the schools, it also reduced segregation across schools. And for the platform, once the development costs were paid, scaling was easy: the feature is now live on the website for everybody on the platform, we've added other search features, and we can keep working with them. We're now trying to A/B test different kinds of information, value-added versus test-score levels and so on, and exploring whether we could suggest certain listings to families based on their prior search behavior.
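On the A/B testing piece: once the school quality feature exists, trying different framings of the information (value-added versus test-score levels, say) is largely a matter of deterministically assigning each user to a variant and logging what they see. A generic sketch of what I mean, with hypothetical variant names; this is an illustration of the design, not the platform's actual implementation.

```python
# Generic sketch of assigning users to information variants for an A/B test on a
# listings site. Variant names and the user-ID scheme are hypothetical.

import hashlib

VARIANTS = ["no_school_info", "test_score_levels", "value_added"]

def assign_variant(user_id: str, experiment: str = "school-info-framing") -> str:
    """Deterministic, roughly uniform assignment: hash the experiment name plus user ID."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

for uid in ["user-1001", "user-1002", "user-1003"]:
    print(uid, "->", assign_variant(uid))
```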

So I'll finish up with a point that Tory and others can expand upon: why are some obvious innovations left on the table, for instance adding school quality information or improving school-to-parent information flows? The EdTech marketplace is atypical in some respects. Others have mentioned this in other spaces, but the end users are not the ones in charge of procurement, and I think this can leave some obvious benefits on the table. The incentives are not well aligned: EdTech is often built around administrative pain points, which crowds out student needs. I talk to my friends in the space, and they get a constant flow of requests for crap features on their LMS rather than, say, easily programmed text messaging features that have proven benefits. The incentives are often tied to inefficient inputs as well, one obvious one being compliance. And then school district leaders often care more about the perception of efficacy than the reality of it, which speaks to how much they actually value evidence-based interventions; I think it's maybe a mixed bag there. So how and where can we innovate in this space? I've mentioned one way, and I'll be interested in Tory's and Amy's thoughts on others. I'll stop there.
