Karen Sandler


Introduction and Early Childhood

00:00:00

Educational Journey and Cooper Union Experience

00:06:23

Law School and Corporate Legal Career

00:11:47

Joining the Software Freedom Law Center

00:16:10

Medical Device Advocacy and Personal Experience as a Cyborg

00:20:00

Evolving Concerns: Control and Access to Critical Software

00:25:52

Outreachy and Transition to Software Freedom Conservancy

00:34:55

Recognition and Honorary Doctorate

00:43:16

Future of Technology and Hopes for Change

00:47:35

Final Reflections and Call to Action

00:51:03

On Being a DJ and Closing Remarks

00:52:28

Karen Sandler

00:00:00

Elisabetta Mori

It's the 8th of May, 2024. I am Elisabetta Mori, a historian of computing currently based in Italy, and I am recording an oral history interview for FOSDEM. Today I'll be talking to Karen M. Sandler. She's an attorney and the executive director of Software Freedom Conservancy. She is known as a cyborg lawyer for her advocacy for free software, particularly in relation to the software on medical devices. She was the executive director of the GNOME Foundation and the general counsel of the Software Freedom Law Center. She's also a lecturer in law at Columbia University. In 2023, she received an honorary doctorate from KU Leuven, Belgium. We are recording on TheirStory. I am in Livorno, Tuscany, Italy, and Karen is in the US. Okay. Can we start with you describing your childhood, your family, your first encounter with computers?

00:01:16

Karen Sandler

Sure. I grew up in the suburbs in the United States. My parents had grown up very poor. They were the children of parents who had fled persecution in Eastern Europe, and so they had come to the United States with nothing, and they really built this life for themselves. They felt this big responsibility growing up, and they had a real emphasis on education and making use of all of the opportunities that were available to us. My father was an engineer and my mother was a school teacher, and together that made this real atmosphere with a love of learning. It was a childhood where my mother, who was very interested in archaeology, would take me to museums constantly and to lectures, and my father would take me to the planetarium for lectures on astrophysics. He was more of an engineering scientist, so he was quite technical, and he was one of the first people using computers to replace the manual crunching of numbers. His big contribution to his field, he was an expert in vibrations, was to write a computer program that analyzed vibrations in different mediums. When I was a little girl, he was constantly working on this program and on this analysis. It was really fascinating to grow up with him around the house, because when I was little, I think it was the very end of computer punch cards being a thing, and they had all of these computer punch cards that they didn't know what to do with in his office. So he brought them home, and they were constantly all over the house. We used them to take notes when people called on the phone, or to leave little messages for each other. It's funny, because this was an ever present part of my childhood. One of my earliest memories is going to his office with him as a little girl and watching the punch cards fly through the machine. The way the punches looked, they looked like animals to me, and I would imagine the different animals moving. I must have been two or three. Then, when I got a little older, when I was four, we would play these really old style video games in his office. And of course, because he did this, he had computers in our house. So I was very lucky that I had access to computers and to learning how to use them, and to playing games. I think I wrote my first really, really silly program when I was probably five or six years old. Then, maybe when I was seven, we learned the times tables at school, and it was a totally different time. The teacher said, on Monday we're going to start doing a quiz every day, and as soon as you kids get 100% on this quiz, we'll move your chair to the opposite side of the room. And I said, oh, that has to be me. I have to move to the other side of the room. So I went home and I wrote this little game to test myself on my times tables, and I played it all weekend. I made a little ASCII rocket go up the screen when I got it right, and I loved it. So it was really the first functional piece of code I ever wrote.
And then on Monday, I went to school and I felt so bad. In retrospect, I remember the reactions of the other kids, because they had just been told, oh, this is starting on Monday, and they didn't think about it. But I got 100% on that quiz because I had practiced with that game all weekend. So I moved to the other side, and it took a couple of days for the other kids to learn their times tables and come over to the other side of the room. But I remember it fondly, because coding was something that was fun, but it was also a way that I could do things that I wouldn't otherwise be able to do. So I grew up really fortunately, I think. And, as we were talking about before this, Elisabetta, you asked me what was the first computer that I remember interacting with. Aside from, I guess, the punch card situation, the one that I remember the most from when I was little at home was this portable computer we had. It was this huge box with a teeny tiny greenish-black screen, and it had a big handle on it, so you could carry it like a huge suitcase. And you could take the keyboard and attach it to the computer. So that was pretty cool.
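For readers curious what a drill like the one she describes might look like, here is a minimal sketch in Python. The original was a child's program on a home computer decades ago, so the language, the rocket art, and the scoring below are illustrative assumptions, not her code.

```python
import random
import time

ROCKET = ["  ^", " /|\\", " |||", " /_\\"]


def launch_rocket(height=8, delay=0.1):
    """Redraw the rocket a little higher each frame so it appears to climb."""
    for lift in range(height + 1):
        print("\n" * 20)                      # crude screen clear
        print("\n" * (height - lift), end="")  # fewer blank lines = higher rocket
        print("\n".join(ROCKET))
        time.sleep(delay)


def times_table_quiz(rounds=10, max_factor=12):
    """Ask random multiplication facts; launch the rocket on each right answer."""
    score = 0
    for _ in range(rounds):
        a, b = random.randint(1, max_factor), random.randint(1, max_factor)
        answer = input(f"{a} x {b} = ? ").strip()
        if answer == str(a * b):
            score += 1
            launch_rocket()
        else:
            print(f"Not quite: {a} x {b} = {a * b}")
    print(f"You got {score} out of {rounds}.")


if __name__ == "__main__":
    times_table_quiz()
```

The sketch keeps the loop structure she describes: ask a fact, check the answer, and reward a correct one with the rocket animation.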

00:06:23

Elisabetta Mori

That was pretty cool, really. They were portable, or at least you could move them, because for a long time you couldn't move computers out of the big rooms they were in. So that was a big advancement. So, what about education? What about school? Which schools and universities did you attend?

00:06:49

Karen Sandler

Well, when I was a kid, I just went to the local public schools for elementary school, middle school, although we called it junior high then, and high school. Then I went to a school called Cooper Union for undergraduate, which is super special, because in the United States colleges are very, very expensive, and it was the only school that was completely tuition free. So I went there, and it was amazing. It was founded on the principle that tuition should be as free as air and water. And it's interesting, because Cooper Union was at one point invited to become a part of Columbia University, but because Cooper Union accepted women and refused to stop accepting women, they decided not to join Columbia University, which is fascinating, the way that whole bit of history went. So yeah, I went to Cooper Union. Previously there had been no formal computer related education in my elementary school or my high school, no computer programming classes, nothing. I had basically been doing a ton of basic programming at home, just doing whatever I wanted. Then when I went to college it was very rigorous, and because I'm old, the first class I had to take in college was C and Fortran, because I went to engineering school, back in the olden days. It was really fun, and that was my first real introduction. When I got to the school, the very first thing on the very first day of my very first class was to get an account in the computer center, because of course then nobody had a computer. I had a really ridiculous notebook computer, as they called them at the time, which I felt very lucky to have, because nobody really had one, but it was a castoff from my dad, and it had a teeny tiny amount of RAM, ridiculously tiny. But you could use it to do text based things, and I loved it, and I had that at home. But nobody else had a computer, so you had to have an account so you could use your workstation and log in. So I went in to get my account with another woman who was in the incoming class with me, and we went to the computer center. At the computer center there were no women, and it was super weird, because there was actually a guy who had ASCII porn on his workstation. And the friend that I was with, who was my new friend because it was the first day of college and we didn't know each other very well, said, I'm going to come back later. And she left. I don't know what got into me, because I'm very, very shy as a human being, and I'm very conflict averse. But for some reason I was full of boldness on my first day of university. So I went into the computer center and I told Bob Hopkins, who ran the computer center, you know, it's really gross out there. I don't know what you think is going on out there, but you probably want to do something about it. And he said, you're hired. I said, what?
And he said, you know, you're right, you're right. I want you to learn. And then he called in one of the guys who worked in the computer center and said, I want you to just teach her everything. I want her to have root as soon as possible. So I worked in the computer center from the very beginning of my college experience, and that was truly amazing. We installed Linux labs, and this was the 90s, so it was really early stuff, and a ton of fun. And Cooper Union, I got that whole free education. I cannot speak highly enough of it. I think that was so transformative for my career, and the fact that it was ideologically founded really stuck with me. I was not a principled person coming into college. I didn't really think much about society's problems, but I had been such a direct beneficiary of that ideology. And actually, my father had gone to Cooper Union too, and his family certainly could never have afforded an education otherwise. So the fact that that happened, and that it had such an impact on my family, has really stuck with me. I think that part of why I'm so focused on public good now is in part because of how grateful I am for these structures that helped me and my family succeed.

00:11:44

Elisabetta Mori

So what did you do after graduation?

00:11:47

Karen Sandler

So after Cooper, I went straight to law school at Columbia Law School. I can't really give you a good reason why I did that. I really thought I was going to do physics grad school, but my family was sort of pushing me towards the law, really just to find a profession that was more secure financially, which I totally respect. And I thought law could be interdisciplinary and would be fun. So I got into Columbia Law School and I went there, which was really great. I took a whole variety of classes. And then when I graduated, I kind of fell into a corporate law job. They have on campus interviewing, and all the law firms come and try to recruit. At the time the market was really booming, so there were a lot of lucrative corporate opportunities, and as I said, I wasn't very civic minded at the time. I hadn't really thought too much about doing any kind of public good work. So I did all this on campus interviewing, and the way it works is that you do a summer internship and then you get hired for the job afterwards, so the summer internship is the big thing. One of the law firms said, if you come work for us for your summer internship, we'll let you spend half of the summer in Hong Kong, you'll work in our Hong Kong office. And I was like, yeah, I want to go to Hong Kong. So that's how I chose to become a securities lawyer. I took that summer internship and I had such a good time doing it. When I was in Hong Kong, in the office, the law firm I was at, a US law firm, announced that it was merging with an English law firm called Clifford Chance. And the people in the Hong Kong office said, you don't really want to work for them, we don't know anything about that other law firm, they're doing their own thing. You want to stay with us, so we're going to recommend that you start working in London. And I was like, ooh, London sounds nice. So I wound up starting my career in the London office of Clifford Chance as a cross-border securities lawyer doing transactional work. That was great, because I was able to bring my technical background to help evaluate these technological companies that were very large, and I really enjoyed it. I moved back to the New York office, and the partner I worked for shifted over to another law firm called Gibson Dunn, which was really fantastic, so I switched over with him. Most of my clients were in Brazil, and I really enjoyed that for a while. And then I was sort of like, ah, I don't really know, this isn't really for me for the long term, and I was starting to reevaluate my situation. So I was one of the first people at the law firm to quit with nothing lined up. It's super common now, and I think this is one of those being-a-xennial kind of situations, because people didn't really do that before us. But I did that, and it caused a lot of waves, because people were talking about, oh, did you hear that Karen quit with nothing lined up? The reason was that the partner I worked for was starting to use his political capital to help me make partner, and I just didn't want him to have to do that, because I knew I didn't want to make partner. So I gave a very long period of notice.
I let him choose. So I worked for six more months, and then I was going to take another six months off before I started looking for a job. But then I heard that the Software Freedom Law Center was hiring. The founder, Eben Moglen, had been my professor in law school, and he asked me if I wanted to apply there, because he knew I was technical and I'd been in his class. So that's how I wound up at the Software Freedom Law Center, even before I wanted to start looking. I got very, very lucky. And I was someone who really thought that open source was cool. As I said, in college I installed those Linux labs. I really thought it was a neat thing, but I didn't think too much about the societal implications of it, or what it might mean for me. Still, I felt so lucky to have landed in such a cool job.

00:16:10

Elisabetta Mori

So, um, this was 2005, 2006?

00:16:17

Karen Sandler

Yeah, I think it was 2005.

00:16:19

Elisabetta Mori

2005. And you worked for the Software Freedom Law Center from 2005 to 2011?

00:16:27

Karen Sandler

Well, it was such a neat job, because I met so many amazing developers, and I heard from them why they contributed, why they were so passionate about free and open source software. Whether it was the folks who were really the copyleft advocates, the folks working at the Free Software Foundation, or people who were passionate about the collaboration component, like the folks who were my clients at the Apache Software Foundation. And then, most interestingly perhaps, there were my clients at X.Org, who, when I met them, were very adamant about being non-copyleft, that permissive licensing was an ideological choice, in other words, making sure you don't have that reciprocal component to the licensing. Over the course of my career, there's one developer in particular, Keith Packard, who came completely around and is now a very strong copyleft advocate, because he was burned so many times by having his code be made proprietary. So it's really, really fascinating. But hearing from those developers directly about their legal problems, helping them, incorporating their organizations, becoming a really big part of their organizations and what they were doing, meant that I got to learn why they were doing it, and the philosophy, and it really started to sink in. And then, while I was at the Software Freedom Law Center, I found out that I had a heart condition that I was born with. It's called hypertrophic cardiomyopathy. It means that I have a big and really thick heart, and mine is particularly thick; parts of it are about three times the size of a normal person's heart. And it was totally fine. I didn't have any symptoms. I happened to find out about it somewhat accidentally, because I was getting migraine headaches, and the neurologists heard I had a murmur. Anyway, so I found out about my heart condition, and the main thing about it was that I needed to get an implanted pacemaker-defibrillator, just for the defibrillator part, because I was at a very high risk of sudden death. And being technical, and being diagnosed with this condition and knowing that I needed to get a device, it made me research the device. I really wanted to learn about the software on the device, and I hit a complete dead end in my research, where the companies were not able to share any of the information with me to help me understand how my own defibrillator would work. And it was funny, because talking to the doctors, I just asked them, what can you tell me about the software in this device? And the first electrophysiologist I had, his response was, software? What are you talking about? And I explained, this device is run by software, there's software on the device, it's critical to how it operates. And he had no idea. He was like, no, it just works. And I said, no, there are details here you probably should worry about. And then he called in the device representative, the technician who happened to be there in the office at that moment, I got lucky, and that representative said, software? So it was this funny thing. It was a long time ago, and I think people hadn't really thought very much about software.
And that was, I think, my introduction to these issues. The electrophysiologist I was working with got so upset when I started to explain the things that could go wrong with software, and why I was so nervous about it. This was on a follow up phone call, and I explained why I really wanted this information. He got so upset that he hung up on me, because he implanted, you know, sometimes multiple defibrillators on a single day, and I think the idea of anyone asking questions about their safety and efficacy implied that somehow he wasn't doing a good job. But it was a really interesting education for me. It was a very bleak moment and very upsetting. But I realized that the way we think about software, as human beings and as professionals, is really different from the way the technology actually works and what we can rely on. So I became really passionate and really convinced that we need to be able to audit the software we rely on. The idea that there would be software implanted in my body and screwed into my heart, and that I wouldn't be able to see it, was just completely baffling, just completely baffling. And so it caused me to really advocate for the transparency of these devices. I wrote a paper. I started doing other work. I filed Freedom of Information Act requests about the medical device manufacturers to see what I could find. I mostly hit brick walls, but the advocacy component was really important. And as I lived with my defibrillator, I had other circumstances that made me realize that it wasn't just about transparency. So I did all this work at the Software Freedom Law Center. I know you're trying to get some narrative of my timeline here. So I kept working at the Software Freedom Law Center. I became general counsel there, and I still provided legal advice to the clients of the Software Freedom Law Center, which were really amazing organizations in and of themselves. And over time, one of those clients recruited me to be their executive director. So I went over to the GNOME Foundation. GNOME is desktop software; if you're running a Linux machine, the most common desktop is GNOME. They had just launched GNOME 3, which is the format of GNOME now, and it's just really shiny and beautiful, and they needed someone to come in and help push the foundation forward. So they recruited me, and I went over there. And while I was working at the GNOME Foundation, I was pregnant and I was set to have a baby. And while I was pregnant, my heart palpitated, which happens in about a third of all pregnancies, or something like that. It's super, super common. Your heart beats a little bit faster, has palpitations. When I give this talk in a whole room, all the women are usually nodding their heads. And it would be fine: if you were pregnant and went to the doctor and said, I've had my heart beating funny, they'll say, oh, you're pregnant, it's palpitations, let us know if you pass out or if you feel lightheaded. Right. And none of those things happened to me, except that I had a defibrillator. And my defibrillator thought my heart was in a dangerous rhythm, and even though I didn't need it to, it shocked me unnecessarily. Repeatedly.
The only way I could get it to stop shocking me was to take medication that slowed my heart rate down so much that it was hard to climb a flight of stairs. And being pregnant was a temporary condition. I've done it twice, the babies are fine, it all worked out great. But it was really, really eye opening, because it really stood for the proposition that our technology may not be made for us. Device manufacturers desperately don't want pregnant patients getting shocked. What a nightmare that is. Definitely not what they want. But only 15% of defibrillators go to people under the age of 65, so the number of people who are pregnant with defibrillators is teeny, teeny, teeny tiny. So yeah, it just stands for the proposition that our technology may not be made for us, and what will we do when it fails?

00:24:44

Elisabetta Mori

Yes.

00:24:44

Karen Sandler

Can you put us on mute for one second? I'm sorry.

00:24:48

Elisabetta Mori

So your story made me think of, you know, we discussed this at the beginning, Marie Moe, who wrote a nice article about her pacemaker and how she, as someone who is an expert in cybersecurity, realized how vulnerable she was by having a medical implant. So this is making me think a lot about the following: this was something that happened to you in 2006, and now almost 20 years have passed. How do you see everything around implanted medical devices, around being, as you call yourself, a cyborg? What's the current state?

00:25:52

Karen Sandler

So the defibrillator, I got that 20 years ago, but the pregnancy happened later. And what's really fascinating to me is that I previously thought it was only about transparency and auditability, and then I realized that it was all about control, about who has control over the software that we fundamentally rely on. And then, most recently, last year, as is typical with my heart condition, I wound up getting an irregular heartbeat. It happened to my dad when he was 32; I was a little older. But it's quite common. And all of a sudden I needed to know, when I got an irregular rhythm, whether it was safe to travel, because I had an important trip. I was getting an honorary doctorate from KU Leuven, and I really didn't want to miss that, because that's kind of a once in a lifetime thing. But I got this irregular rhythm and had to travel two days later. And do you know that, because the device manufacturer's representative was out of town visiting family in India and they did not have an alternate person, I could not get my defibrillator interrogated in all of this major metropolitan city that I live in. Wild stuff. And I have this defibrillator in particular because, while it's made by a large manufacturer in Europe, it has a much smaller imprint in the United States. But I use them because these devices are totally open to interference. Researchers have shown that you can run their batteries down, you can have them shock unnecessarily, because they're all broadcasting remotely, and the encryption on these devices has historically been terrible. There have been improvements lately, but really just awful stuff. And so I found this one manufacturer where you can disable the broadcasting in software, whereas the others all just keep broadcasting constantly. So at least with my defibrillator, you have to use what they call a programmer, and you have to be right on top of me in order to read it, and I'll be able to see it happening, whereas with all of the other ones, or with this one if you enable the radio telemetry, you could be all the way across the street. And as I said, I live in a major city, so it's so densely packed that a person could be anywhere and interfere with these devices. So I need this manufacturer. And we may get to the fact that I run a diversity initiative for folks subject to systemic bias, but some of my work makes some people very angry, and I think just being a prominent woman in technology means that I have a lot of unpleasant experiences, which includes, I'll just say, that I've gotten rape and death threats. So it's not theoretical to be worried about somebody interfering with my defibrillator. So I really needed to get this device. However, it meant that I'm beholden to this one company. None of these devices are interoperable. You can't use the interrogator from one device to read the information off of another. And so I was completely unable to get the information off of my defibrillator in my own city. And then I was in Belgium for the honorary doctorate, and I was able to go to a Brussels hospital, and they were able to interrogate my device and get the information off.
But it was just this horrifying moment of being so helpless, because not only could I not see the code in my own body, not only did I have no control over it when it was inappropriately shocking me and I needed to take unnecessary medication to stop that, but also I couldn't get the very information that I needed to make really important choices about whether it was safe for me to travel, or how it was safe for me to live my life. So it's been like every five to seven years there's some other major lesson in technology that I'm learning from living with this defibrillator. And now I have a second one. This is my second defibrillator. The first one lasted ten years; my current defibrillator, they predict, will exist for 17 years. And when you think about the life of our technology, 17 years is such a long time. We don't even know what the world will be like in 17 years; much of our technology becomes obsolete in just a couple of years. So the idea that we're making these choices that we're going to have to live with societally for a long time is really important. And at the same time, I realized that this technology that is sewn into my body is a metaphor for all of the other software we rely on. It's not really about medical devices so much as it's about everything that is critical in our society, whether it's our stock markets, our basic infrastructure, the stuff we rely on to educate our kids, the technology that we use for our banking and for our basic communications. We're combining everything with everything else, and we're only as safe as our weakest link. So I'd say that all of the things that I talked about, now almost 20 years ago, a lot of it felt really theoretical and space agey back then, and with every passing year it becomes the reality of what we live in, and every year it becomes more and more critical that we take control of our software as a society. And unfortunately, what's happened over time is that while we have more open source than ever, FOSS is in everything, the Linux kernel is in refrigerators and toasters, no one would ever imagine launching a business without using free and open source software, and economists just valued it at a ridiculously huge figure, a value that is more than the value of Microsoft, for example, by a lot. So it's so critical. But the freedom we have, the choice, the control over our technology, is so much less than what we've ever had before. It is near impossible to replace the software on any device that you own. It's impossible to repair the software on devices that you own. And our ability as a society to move from one product to another is almost negligible. It's astounding to me. At this very moment, just recently, the United States passed a bipartisan bill to require the sale of TikTok in the United States to a US entity, which solves none of the problems; it doesn't actually create any ethical technology.
So people are waking up to this idea that there's a problem with our technology and that we need to be doing things differently, but they don't know how yet. And what we need is software freedom. What we need is decentralization. What we need is encryption. But these problems are hard, and they require activity on a mass level. We need to have a lot of collective action, and that takes work, and that takes money, and it takes inconveniencing the business models of companies in the tech sector. So if you're asking me how I feel about it now, looking back, I'd say I am more right than I ever wanted to be, which really saddens me, and the state of our technology is so much worse than I expected it would be by now. But at the same time, I have so much more hope than I've ever had before, because I think that people are starting to get the fact that something's wrong, and they're starting to understand that our technology is so important that we can't just sit by and let companies subjugate our own digital rights to their quarterly profits.

00:34:21

Elisabetta Mori

Thank you. So, let's go back to the time you were working for the GNOME Foundation. In 2012, you got married, but at the same time there was another important activity: you also worked at GNOME on the Outreach Program for Women. Can you talk about those years, and about what brought you to work for the Software Freedom Conservancy a few years later?

00:34:55

Karen Sandler

So GNOME actually started the Outreach Program for Women when I was at the Software Freedom Law Center, and they asked SFLC for legal help in setting it up. So I've been involved with this program since its inception, which is pretty exciting, although other lawyers were more of the primary contacts on that. It started in 2010, and it started because I was participating in a program called Google Summer of Code. Google Summer of Code is an internship program that Google runs for open source projects, where students get paid internships and work remotely for a few months. And at GNOME, they had 181 applicants, and none of them appeared to be women. And so my good friend Marina was tapped to look into it and figure out what was going on. Well, actually, there was another program that was started in 2006, but then nothing happened with it. And then in 2009, Marina, who was very involved with the GNOME Foundation and was a GNOME Shell developer, was tapped to create something to help with this. So she revamped that whole program and launched the Outreach Program for Women, which started with six interns in 2010, working on various aspects of GNOME. And I came over as executive director, and it was just, you know, whatever the professional equivalent of love at first sight is. Marina and I jumped into it together and we immediately started building the program together. We expanded it to other communities and slowly grew it. And I had been a co-founder of the Software Freedom Conservancy from when I was at the Software Freedom Law Center. I was the one who filed the incorporation papers, and it was an initiative of the Software Freedom Law Center to launch Software Freedom Conservancy. So I had been a volunteer there throughout the time I was at GNOME. Software Freedom Conservancy is a fiscal sponsor; it's an umbrella organization, and it's the home of many free and open source software projects. We do three different branches, which I guess I'll explain in a little bit. But Software Freedom Conservancy was actually the first outside organization that wound up participating, with its Twisted project, in the Outreach Program for Women early on, and we realized we could expand it. We expanded to the Linux kernel, and soon it was a larger program where we expanded its criteria. We ultimately renamed it Outreachy. And now we provide internships not just to women, and actually not only to women now, because the program is open to people who are subject to systemic bias and impacted by underrepresentation in the technology industry. So we seek essays from people and they tell us; we don't presume to understand all of the discrimination and systemic bias that's out there in the world, so we have our applicants tell us, and the essay is not graded based on how beautifully it's written, or even on whether the person uses correct grammar and spelling. For us, we just want to hear about people's experiences, how they're impacted by systemic bias and by discrimination. And now we've run the program all of this time, and it's been an amazing journey.
GNOME launched the Outreach Program for Women, but as it was rebranded into Outreachy and as it grew, it moved over to Software Freedom Conservancy. And it was funny, because I had already left GNOME as executive director and come over to Software Freedom Conservancy as executive director, kind of swapping my volunteer role for my professional role and vice versa. So I became a volunteer for GNOME and joined its board of directors as a volunteer, and my executive director role was with Software Freedom Conservancy, which was super fun. And then, on the same path at the same time, Outreachy moved from GNOME to Software Freedom Conservancy. And it was fun, because I basically said, well, I can't be part of that decision, because I'm completely conflicted: I'm on the board of directors of GNOME, but I'm executive director of Software Freedom Conservancy. So Marina handled that evaluation herself, and I lined up lawyers directly to advise Outreachy and to advise Software Freedom Conservancy. It was a really interesting little thing. And then in 2014, I think it was, Outreachy moved over to Software Freedom Conservancy, where it is now. And Software Freedom Conservancy is my absolute dream job. I call myself the luckiest cyborg lawyer in the world, because of the main three things that our organization does. As I said, we're a fiscal sponsor, and we recognize that no one can stop using proprietary software if there's nothing to switch to, so we help support the creation of free and open source software. Then we have a second branch; our logo is a tree, so we talk about our branches. Our second branch is Outreachy, where we provide these opportunities, because our software will never be made for everyone if it's not made by everyone. A good example of that is the way my defibrillator shocked me, because I was not part of the anticipated use case. Right? There were all these great little videos flying around Twitter once upon a time of soap dispensers: you see a person with a light skinned hand put their hand under the soap dispenser, and soap comes out as you would expect it to. And then someone with dark skin puts their hand under the very same soap dispenser, and nothing happens until they put a white paper towel on top of their hand, and then the soap comes out. And it's just very clear that there was no one on the team that made and tested that soap dispenser who had dark skin. So until you make your technology diversely, you won't be able to anticipate all of the use cases. Even then, technology needs to be changeable, it needs to evolve over time, but you won't stand a chance if you're just making it with a small group of people. So that's our second branch. And then our third branch is standing up for copyleft, which, as I explained earlier, is this licensing regime where instead of using copyright to keep software monopolized, as a way to extract royalties from people, you use that copyright to require sharing. Detractors used to call it viral, but it's a forever free model. And what's neat about copyleft is that it is actually a software right to repair. It basically gives millions of people rights to fix their devices.
It gives you the right to ask for the source code. And if you look at one of the most popular copyleft licenses, the GPL, specifically GPLv2, that license says that they must provide the complete and corresponding source code, including the scripts used to control compilation and installation. So you must be able to replace your software; you must be able to repair it. And that's really our third branch. We do things like the lawsuit we filed against the television manufacturer Vizio, which is a very large TV manufacturer in the United States, because they were not following the rules: they weren't making their complete and corresponding source code available. And so, when they didn't respond to us asking them very nicely, we filed a lawsuit as a consumer, as a purchaser of devices, because purchasers are the ones who know that the device is not in compliance and who also have an acute need for the source code. These TVs surveil us, they collect a lot of information about us, and they also could be useful for doing other things than what the device manufacturers think we should be using them for. And because of these licenses, we have a right to that software. So we're trying to connect the dots on copyleft and how critical it is to the things that folks do. And so I really feel super lucky to work at Software Freedom Conservancy.

00:43:16

Elisabetta Mori

Thank you. You also received several awards: in 2011, the O'Reilly Open Source Award; in 2017, the Free Software Award for your work promoting software freedom; and in 2023, the honorary doctorate in Belgium from the Katholieke Universiteit Leuven. You mentioned it earlier, but can you tell us, first: did you expect it?

00:43:59

Karen Sandler

No, it was a huge surprise, and it's probably the most meaningful. Those other awards were very nice; as an adult, you don't really have many opportunities for people to say good job, to get recognition for what you do, so they were very meaningful to me. But the KU Leuven honorary doctorate was a total surprise, and the best thing about it is that it was the students who decided I should get it. They do five per year: four are nominated by faculty, and those are usually folks who are working on things that the academics at KU Leuven are building on from their work, for example. And then the fifth one is nominated by the students, and it's a 60,000 student university. So presumably some of the students want to give it to, you know, Taylor Swift or somebody like that. What was really neat about it was that it reflected the fact that there were students at the school who understood the value of free and open source software and understood how important the concept of software freedom is, that they had encountered my work, that they found it interesting and inspiring, and that they wanted to collaborate with the movement. That was really, really huge. And then the fact that the students who didn't know about it, which presumably was most of them, could understand why the work was important, and that they cared about it, I think was the most incredible thing. I've done a lot of interviews over the years; there have been articles and documentaries, and in most of those works there's something wrong, some fact that's wrong, something that they didn't quite get right about what the work stood for, or even just some basic fact. But in the movie that they made, and in the explanations that they gave about why they wanted to give me the award, they got everything 100% right. And I was so struck by that, because they really took the time. What it stands for, for me, is the fact that something has fundamentally changed: young people care about the state of their software, they care about who's making it, they care about what it will mean for their lives down the road, and also about what they think is valuable and what they want to spend their time on. And when I gave a lecture around the doctorate, there was great participation from folks in the law school, in the engineering school, in computer science. So it was really, I'd say, one of the most meaningful things, because it made me feel like I'm not just on a hamster wheel, like I'm not just shouting into a void, trying to get things to change that will never change. In fact, it stands for the fact that things will change, because smart young people care about it. Smart older people also care about it, and they're retiring; we're about to have this explosion of productivity, and I just can't wait. It's such an exciting time in software freedom and in technology in general, because while we've created this massive dystopia, we're finally at the part where things can change.

00:47:16

Elisabetta Mori

So if you look to the future, what are the things that you can see that are going in the right direction? And what are the things that we really need to be careful about?

00:47:35

Karen Sandler

Yeah, I mean, I'd say it's funny, because one of the things I'm most proud of in my career is the fact that we've had more than a thousand students come through the outreach program, all these opportunities for folks who are subject to systemic bias and who wouldn't ordinarily be included. And I think software freedom stands for the proposition that we can democratize our technology, but in effect we haven't really done that very much. Most of our technology creation is very homogeneous, and it's done by a very centralized, small group of companies. So looking forward, what I'm hoping to see is this explosion of contributions from folks who have had enough. It's really hard to change what you do; if you're used to having certain apps or features on your phone or on your computer, it's really hard to go back. But people are willing to. One of my kids is now 11, and quite a few kids now have flip phones and are happy for it. They've read the articles saying that social media causes depression in teenagers. They've read that screens and the use of smartphones interfere with their brain development. And they don't want it. That's what, for them, grown ups do, and they don't want it anymore. Right? So looking forward, I think about the sharing of information, where we're able to look critically at things: we are a data driven society where we look for concrete aggregation of information about the impact of technology, and we're seeing what doesn't work. We are seeing an increase in the willingness to regulate. I was astounded that GDPR was able to happen in Europe, and so happy and grateful, and I think we're going to see more regulation. But the downside is that there's a contraction of funding across the board. There's less money coming into software freedom, and there was a co-option of the software freedom movement by companies that wanted to extract the value of open source. And so we're going to have to get over this, and we're going to get over it through a really catastrophic period; I mean, it really looks like we're in this catastrophic time where funding is contracting, companies did massive layoffs, and they're very focused on their profit centers. So I think things are going to get worse before they get better. But I think through all that pain we'll emerge on the other side with a willingness to have more of the societal collaborations that we need for our software. And it'll be very interesting to see what happens with generative AI, and when that bubble likely bursts, that'll be a really interesting situation to see. So it'll be fascinating. I'm sort of in the middle of my career, and I'm hoping that I can hold out long enough to be a part of the real change, where things actually get a lot better.

00:50:57

Elisabetta Mori

Uh, is there anything you would like to add that we haven't covered today?

00:51:03

Karen Sandler

I would say that if you are able, you should donate to the causes that you care about. With Software Freedom Conservancy, that's sfconservancy.org; please, please, please sign up as a sustainer. But the situation is getting dire across all of the nonprofits in our space. Our technology has never been more fragile, and we need both the advocacy and the technology creation. So if you feel you can do that, you should. Also, speaking up: it's becoming much harder to stand up to discrimination and systemic bias. Lots of companies, for example, have cut their funding because they don't want to support anybody's woke agenda, which I completely understand, but it does mean that these opportunities for people to get involved are fewer and farther between. So getting engaged, I think, is really important. And I know this is meant to be archival, so I'm talking in this moment of time, and I'm hoping that in the long run you people of the future will say, ha ha, what an interesting time it was in the 2020s, and now we've solved so many of these problems. I hope we have. But I guess the main thing about our particular time is that it's so dire that every single individual needs to get involved, and it needs to happen now.

00:52:28

Elisabetta Mori

So we are almost done, but I hope we have other chances to chat again, maybe on Thursday. One last question: on your website, punkrock.com, it says you're available as a DJ. Are you still available?

00:52:50

Karen Sandler

Oh, boy. You know, I was actually just asked to DJ for an event in August. Maybe I will; it's very fun. I haven't done it in a little while, though, so I'm kind of rusty.

00:53:08

Elisabetta Mori

Okay, okay. So, um, I hope we'll have another chance to chat again soon. And thank you for now, and it's been a real pleasure talking to you today. Thank you very much for what you're doing.

00:53:27

Karen Sandler

Thank you, Elisabetta. I'm so happy to be here.

00:53:30

Elisabetta Mori

Thank you.
