
Interview with Daniel Roesler

Shadow of the Valley

--

Daniel Roesler is the co-founder and CEO of UtilityAPI, an energy data software service based in the San Francisco Bay Area. In his spare time, he develops security and privacy applications and volunteers for the privacy advocacy group Restore The Fourth, which rose in response to the Snowden revelations’ exposure of mass surveillance in June 2013. The group opposes unconstitutional mass surveillance by the government and defends the right to privacy and protection against unwarranted search and seizure. Daniel, welcome.

Listen to the full interview here!

Daniel: Howdy.

Tal: How are you this morning?

Daniel: I’m doing well. How are you?

Tal: Great. Thank you for joining us. I appreciate you taking the time to do this. I want to start off by asking a little bit about the work you do. It seems to span quite a few different areas that aren’t often thought of as connected but they’re fairly important in different ways to understanding the challenges of our times, namely green energy and privacy. If you wouldn’t mind just taking a moment to tell us a little about what it is you’re doing and how you see the two as connected or perhaps not connected.

Daniel: Sure. Well, I’m going to start off by giving a little bit of background on myself. I grew up in Texas. I got a chemical engineering degree from the University of Texas at Austin, and so naturally, like most chemical engineers from UT Austin when they graduate, I moved to Houston. I was a project engineer for an environmental testing company coming out of college. The company was called Clean Air Engineering, and we measured source emissions, which is basically the pollution from smokestacks. We literally climbed smokestacks, stuck a probe in, and measured the chemicals coming out. We were kind of the front line of environmental regulation. I did that for a while, so that basically set the precedent: my default is energy and environmental work. The privacy advocacy stuff actually came after college and after the environmental work.

After working at Clean Air for a while, I wanted an adventure, and that’s where I got into entrepreneurship and the start-up scene. I’d been programming as a hobby my whole life, so I wanted to try that out as a career, and I joined a start-up in New York for a couple of years. Unfortunately, like most start-ups, it didn’t work out, but it really gave me the confidence to be able to see an opportunity, go off, and not be afraid to jump off the cliff and start a company solving a problem. With that experience, the mesh of the environmental work and the entrepreneurial experience, I came back and wanted to get into the next generation of energy.

I didn’t know much about it, so I just jumped in and got a job at a normal project development company. We mostly focused on commercial solar and energy efficiency for school districts and cities in California. I learned a ton about the industry and built some projects. It was a lot of fun, and I kept running into a data problem at the company. It was around this time that I started getting more active in the privacy sphere. It was later on, at the start-up and then getting back into green energy, that I started to become more active in privacy, primarily because of the amount of mass surveillance that was starting to take off at the time, and Snowden really catalyzed that.

Tal: Sure, sure. I just want to clarify that you mentioned a data problem. Are you at liberty to talk about the nature of that problem?

Daniel: One of the big things, and this is going to be pretty high-level before I go a little more into specifically what UtilityAPI does: the way the energy transition will happen is that we need to switch sources of energy over the next 10 to 20 years to something else. That’s a huge transition. That’s 87% of global energy sources having to switch to something else in a couple of decades, half of a generation. That’s an incredibly huge undertaking, and it has to happen a lot quicker than assets will pay off.

If I have a coal power plant and I am paying for it over 40 years, and all of a sudden I can’t use that coal power plant anymore because of competitive forces in the market or the energy transition happening, I suddenly have a stranded asset. I’m left with the bill for half of my asset because I haven’t paid it off yet. The biggest concern in the energy transition right now is that we’re actually moving so quickly that we will have stranded assets, which, as somebody who’s interested in preventing climate change, I could not care less about, but it is one of the biggest issues. Innovation and new technologies have been very slow to be adopted in the existing incumbent energy industry. Utilities have been slow to adopt new or clean technologies, even though they’re more economical, because they are worried about stranded assets.
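
To make the stranded-asset arithmetic concrete, here is a minimal sketch assuming straight-line cost recovery; the plant cost and retirement year are hypothetical illustrations, not figures from the interview.

```python
# Hypothetical stranded-asset arithmetic: an asset financed over 40 years
# that stops earning at year 20 leaves half of its cost unrecovered.

def stranded_value(capital_cost: float, payoff_years: int, retired_at: int) -> float:
    """Unrecovered cost if the asset stops earning at `retired_at`,
    assuming straight-line cost recovery over `payoff_years`."""
    years_left = max(payoff_years - retired_at, 0)
    return capital_cost * years_left / payoff_years

# A $2B coal plant financed over 40 years but retired after 20:
print(stranded_value(2_000_000_000, 40, 20))  # -> 1000000000.0, half the bill remains
```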

“It was later on, at the start-up and then getting back into green energy, that I started to become more active in privacy, primarily because of the amount of mass surveillance that was starting to take off at the time, and Snowden really catalyzed that.”

Tal: Sort of a sunk cost fallacy.

Daniel: I’m not getting there quickly, but I’m getting there.

Tal: Okay, yeah, but just setting that up, there’s a sort of sunk cost concept there. We’ve already invested so much in coal, so we want to capitalize on our investment.

Daniel: There’s a great report from the UK that talks a lot about the carbon bubble. If we want to hit the Paris Agreement target, the two degrees Celsius, we can only pull up a quarter of the proven reserves of fossil fuels in the world. That’s proven reserves, not even counting unproven reserves. If we want to hit three degrees, which is where the Pentagon says we start killing everybody due to food shortages and mass migration and all this sort of crazy stuff, you can only pull up a third of the proven reserves. There is going to be a huge amount of stranded assets on the books of existing fossil fuel companies. Stranded assets are probably going to be the biggest issue, or the biggest friction point, in the energy transition.

That means that the grid operators, the current incumbent utilities, want to slow down, or are cautious or risk-averse about, adopting new technologies. What happened, and I learned this when I moved out to California after the fitness start-up, was that all of the clean energy technologies started selling directly to consumers and bypassing the utility entirely. That’s what the rooftop solar revolution is. They found, hilariously, that it’s easier to sell 10,000 solar systems to 10,000 customers than it is to sell 10,000 solar systems to one customer, the utility.

Tal: There’s a kind of alternative grid emerging where people are collecting their own energy off the grid. It’s kind of where Tesla’s Powerwall is being used.

Daniel: Exactly, exactly.

Tal: Tesla’s Powerwall being that series of very high-efficiency batteries that were designed to be used with rooftop solar and so on.

Daniel: Exactly, exactly. You have a lot of distributed energy companies. The technical term for what they sell is DERs, distributed energy resources. If you ever read any energy industry news, DER is an acronym you’re going to see a lot, because it’s the next wave of energy. Over the past decade, distributed energy resources have really taken off. If I’m a building owner and I’m looking at lowering my energy costs, somebody is going to come to me and say, “Hey, if you switch to batteries plus solar, you’re going to save a whole bunch of money on your energy bill.” The next question out of my mouth is going to be, “Well, how much money?” To answer that question, you need to take a look at the customer’s historical energy usage data. You basically need to look at their historical bills and see what they’ve been paying.

Tal: Got it.

Daniel: In order to calculate how much they would save by installing this product. That friction point is what I experienced at this solar development company in California when I moved there. I was in charge of putting together the proposals, the RFP responses, for various school districts and cities across California. Say I’m a school district and I have 20 schools, and I want a quote from a solar company on how much I would save by putting solar on some of those schools, and at which schools. Putting together that proposal, calculating what the savings would be based on past usage, and getting the original historical data set for those 20 schools really sucked. It was incredibly hard. It took six weeks, and it was just an incredibly painful experience, and it slowed everything down.
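
For a sense of what that feasibility math looks like once the historical data is finally in hand, here is a minimal sketch; the flat tariff, the interval readings, and the solar offset fraction are all hypothetical simplifications, not UtilityAPI’s actual models.

```python
# Minimal feasibility sketch: estimate annual cost and solar savings from a
# year of 15-minute interval readings. All numbers are hypothetical.

def annual_bill(interval_kwh: list[float], rate_per_kwh: float) -> float:
    """Current annual cost from a year of interval readings, at a flat rate."""
    return sum(interval_kwh) * rate_per_kwh

def solar_savings(interval_kwh: list[float], rate_per_kwh: float,
                  offset_fraction: float) -> float:
    """Savings if solar offsets a flat fraction of usage (a crude first pass;
    real analyses weigh time-of-use rates, demand charges, and tariffs)."""
    return annual_bill(interval_kwh, rate_per_kwh) * offset_fraction

# A year of readings: 96 intervals per day, 365 days, ~0.5 kWh each.
usage = [0.5] * (96 * 365)
print(annual_bill(usage, 0.15))         # baseline annual cost
print(solar_savings(usage, 0.15, 0.6))  # estimated savings at a 60% offset
```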

“If we want to hit three degrees, which is where the Pentagon says we start killing everybody … you can only pull up a third of the proven reserves. There is going to be a huge amount of stranded assets on the books of existing fossil fuel companies. Stranded assets are probably going to be the biggest issue, or the biggest friction point, in the energy transition.”

Tal: That’s where your company comes in.

Daniel: Exactly. That’s where my company comes in: we basically make it easy to download and parse a customer’s utility data in order to do a feasibility analysis or to prove savings after an asset is installed. If I’m a school district and I have solar installed, I want to see what my current savings profile is, to know whether I’m doing something right or wrong, whether the system’s working or not. That ongoing monitoring is another part of it as well. It’s called asset management. The point is that we automate the process of collecting utility data for managing DER deployment. That’s a very technical-

Tal: You’re working … I think it’s quite clear from what you said. You’re involved in collecting energy usage data, all in this attempt to make solar power economically incentivized for large consumers of electricity.

Daniel: Half of our business is-

Tal: Large and small, okay.

Daniel: Half of our business is commercial.

Tal: Okay, got it. Basically, your programs are looking to calculate current usage in order to predict savings. Is that accurate?

Daniel: Yeah. We like to say that we do the half of data science that everybody hates, the collection and clean-up.

Tal: Got it. Okay, so you’re quite familiar with data collection and these kinds of things.

Daniel: The big thing to note about energy data is that it’s never really been available before. Think of your historical utility data: it’s mostly just been on your bills, in a database or a mainframe somewhere at the utility. It’s never really been accessible, and with the adoption of distributed energy resources, it’s starting to have to be, in order to properly run a grid. To get to 80–90% penetration of renewable energy, which is all intermittent generation, your utility is going to need to talk to Tesla to operate your Powerwall in order to balance the grid. Flexibility is the big key thing, and distributed energy resources are a key factor in balancing a grid with intermittent or high-penetration renewables.

Data is going to have to be flowing from party to party where it was not previously. What I’m trying to do at UtilityAPI, part of our mission, is to set the precedent that consent is required in order to access energy data. If you look at the history of different data sets and how they are made available, you have, for example, medical information, and that requires a huge amount of consent and it’s very ...

Tal: Because there’s major consequences if those get into the wrong hands.

Daniel: Yeah, there’s major regulation behind it, all that sort of stuff. Then you look at credit card transaction information, and there’s practically no consent required.

Tal: Then we have issues like what happened with Equifax last year.

“Your utility is going to need to talk to Tesla to operate your Powerwall in order to balance the grid. Flexibility is the big key thing, and distributed energy resources are a key factor in balancing a grid with intermittent or high-penetration renewables.”

Daniel: Exactly. Your credit card transaction history can be bought and sold without your consent. Energy data is something that has never really had any sort of question or regulation around it, because it’s never really been made available or transacted. We are at the advent of determining which road we take with this data. UtilityAPI’s mission is to make sure that it is an informed, consent-driven process, much like HIPAA and medical information, instead of something like browser history, which is bought and sold online all the time. Does that make sense?

Tal: Yes, it completely makes sense and I’m really glad you went through that because now, I think what you did is really make it clear how environmentalism and privacy issues are connected in a way that people normally don’t think about.

Daniel: It takes 10 minutes’ worth of explaining to get there.

Tal: It’s true, but it’s a really important payoff. If you’re an environmentalist, you’re more focused on things like social movements, or maybe you’re on the side of regulation, like you used to be, or actually going and doing reforesting projects and things like that. Privacy, while you might be concerned about it as a citizen, isn’t something you necessarily think about as an environmentalist, but here you are saying we need to think about it if we’re going to be helping the environment.

Daniel: That’s what we focus on at UtilityAPI: the secure authorization and consent for sharing energy data. That’s important. The classic question is, “Why does that matter? Why is energy data something that I should care about?” The first reason is that your energy data, especially with the advent of smart meters, is incredibly revealing about your private life. Whenever you see 15-minute intervals and you see an uptick during the middle of the day, is that because your husband or wife is coming home in the middle of the day without telling you and possibly seeing somebody at home? If your grandmother makes tea at 2:00 p.m. every day and one day that energy spike doesn’t happen, should you call an ambulance? You are growing weed in your house-

Tal: Especially combined with other metadata, it can start to paint a picture of what’s going on in your life. Not that you’re going to get everything just out of the energy data, although I suppose that could happen to a certain degree, but combined with all the other kinds of data being collected, suddenly this richer picture of who this person is and what they’re trying to do comes into clearer view, especially as you go further and further out in terms of your viewpoint, when you look at the whole system, the whole grid, and you see all of these different patterns in energy usage and so on.

“Whenever you see 15-minute intervals and you see an uptick during the middle of the day, is that because your husband or wife is coming home in the middle of the day without telling you and possibly seeing somebody at home? If your grandmother makes tea at 2:00 p.m. every day and one day that energy spike doesn’t happen, should you call an ambulance?”

Daniel: Energy usage correlates with your behavior fairly well, and so you can gain a lot of behavioral information off of energy usage data. There is a company that works with fast-food chains, and they monitor their energy data, and it’s very easy to tell when an employee is staying after hours or something like that in order to, I don’t know, do something shady. It’s incredibly valuable for a fast-food chain, and it’s not energy-related at all. The goal of this company is not necessarily to reduce energy usage. It’s to monitor for rogue employees.
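
As an illustration of how much behavior leaks out of interval data, here is a minimal sketch that flags a day when a routine usage spike (the 2:00 p.m. kettle from the example above) fails to appear; the readings, threshold, and interval index are all hypothetical.

```python
# Toy behavioral inference over 15-minute interval data: detect the day a
# routine 2 p.m. usage spike goes missing. All data here is made up.

def has_afternoon_spike(day_kwh: list[float], interval: int = 56,
                        threshold_kwh: float = 1.0) -> bool:
    """Interval 56 of 96 covers 14:00-14:15; a reading above the threshold
    counts as the routine event (e.g., the kettle) having happened."""
    return day_kwh[interval] > threshold_kwh

week = [[0.2] * 96 for _ in range(7)]  # seven quiet baseline days
for day in week[:6]:
    day[56] = 1.5                      # the usual 2 p.m. kettle spike
# Day 7 has no spike: the routine broke.
print([i for i, day in enumerate(week) if not has_afternoon_spike(day)])  # -> [6]
```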

Tal: There are a lot of great forks this could take, but I want to pull back a moment, and then maybe we’ll come back to this, because I want to put this into the context of what Edward Snowden came out and spoke about back in June 2013 with regards to mass data collection as part of the surveillance program. It’s a little unfortunate, but still, some people will think, “What’s the matter with that? If I have nothing to hide, why should I care?” I think what you just said highlights it, so I want to present that as a way for you to continue. Let’s put it in the context of what he talked about. Perhaps we can also throw in what may or may not be going on with Russia, which we’re not totally clear on at this point in time as we’re doing this interview. There’s obviously those allegations of interference and cyberwarfare going on.

Daniel: There’s a couple of things that I’d like to touch on around that area. First of all, to address the “I have nothing to hide, so why should I care?” question, a lot of people try to argue or rationalize that by saying, “Well, everybody has something to hide, blah blah blah,” but I generally think of it as the wrong question to ask. My general response is, “Well, I don’t get the premise of the question,” primarily because it’s not your right to privacy, it’s society’s right to privacy. The goal of privacy is to allow people to organize in private so that they may voice opinions and cause political change and stuff like that. It’s a safeguard against authoritarianism. People have the right to privacy. It’s not their right-

Tal: As an individual, as an individual.

Daniel: Individual. You’re just kind of-

Tal: Because it’s a collective right.

Daniel: Congratulations.

Tal: You have your due privacy whether you want it or not is what you’re saying.

Daniel: Exactly. It’s like your-

Tal: It’s kind of like a societal … Right, it’s a basic societal need. It’s kind of a red herring to look at it as if it were some sort of individual right.

Daniel: Exactly.

Tal: It’s kind of beside the point.

“It’s not your right to privacy, it’s society’s right to privacy. The goal of privacy is to allow people to organize in private so that they may voice opinions and cause political change and stuff like that. It’s a safeguard against authoritarianism.”

Daniel: It’s not yours to give up, so you can’t voluntarily give up your right to privacy because you have nothing to hide. You had it whether you had something to hide or not. It’s there for people who do have something to hide, primarily to allow political change to happen. That’s one of the areas where I like to focus: training political advocacy organizations on how to ensure private communication and organization. The main thing that I do for Restore The Fourth is volunteer to train or help advocacy organizations in private communications. That’s not all that Restore The Fourth does, but it’s what I do specifically. For example, I helped train one of the branches of the Sierra Club to use secure messaging and to be aware of unencrypted communications, primarily because the FBI spies on the Sierra Club whenever they are organizing pipeline protests. Privacy is necessary in order to organize political activity.

Tal: This is what Snowden often says too, but to boil it down to a few words, there’s no civil society without privacy.

Daniel: Right. You can’t rock the boat without privacy.

Tal: Right. Okay, perfect. Well said. That’s the basis that we’re starting from. Now let’s elaborate on the consequences of this a little, into some areas that maybe people know a little less about, particularly the cybersecurity realm: anything from criminal actors to state actors getting access to this kind of data, and the sorts of things they might be able to do that would be detrimental to both individuals and groups as a result. Also, let’s keep in mind that once data is out there and available, it becomes very difficult to clamp down on. I’ll let you talk now.

Daniel: Sure, absolutely. There’s a couple of points that I’d like to touch on there. First of all, I’m pretty cynical about the culture right now in regards to cybersecurity, for a couple of different reasons. I’m also fairly cynical about the culture of the government in regards to protecting or looking at cybersecurity. One of the big things that happened in the 90s was that cryptographers and the NSA actually got along fairly well. There was a famous incident where the NSA submitted several primes to a cryptography standards body, and people looked at them and thought they were a back door, but they were actually hardening the encryption, making it harder to break. Back in the 90s-

Tal: For those of us out there that aren’t familiar, what are primes?

Daniel: I’m sorry. Just various numbers. In cryptography you have various numbers that you pick in order to do your encryption, and they submitted some numbers that some people thought were broken. If you pick the wrong number, you can break the encryption, but cryptographers at universities later figured out that those numbers were actually making it harder to break the encryption, because they were chosen better. It turned out that the intelligence apparatus was actually helping out the private sector in order to make it more secure.
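
One concrete version of “picking the right number” is the safe-prime criterion used when choosing a Diffie-Hellman-style modulus; this toy sketch illustrates that general idea, and is not the specific numbers or standards body from Daniel’s story.

```python
# Toy illustration: a "safe prime" p is one where (p - 1) / 2 is also prime.
# Weakly chosen moduli make discrete-log attacks easier; safe primes are one
# standard hardening criterion. Trial division only works for tiny numbers.

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def is_safe_prime(p: int) -> bool:
    return is_prime(p) and is_prime((p - 1) // 2)

print(is_safe_prime(23))  # True: 23 and 11 are both prime
print(is_safe_prime(29))  # False: (29 - 1) / 2 = 14 is not prime
```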

Tal: Interesting, yes.

Daniel: It was favoring defense over offense as just the general theme. It was, “Hey, we want American businesses to be secure so they don’t get hacked. Okay, so let’s help them out.”

Tal: Makes sense, right.

Daniel: Even though that comes at the cost of making it harder for us to hack these American organizations, or these organizations as a whole, we still count it as a net gain if they don’t get hacked by others. They preferred defense over offense. That changed after 9/11, when you had a huge influx of resources into the intelligence apparatus to collect as much information as possible to find the needle in the haystack. The attitude changed pretty dramatically into favoring offense over defense. If you find some sort of exploit in an American business’s systems, for example, if you find a way to hack into the DNC email servers, are you going to tell the DNC? No, because you want to keep that exploit to yourself and possibly use it in the future if you need to. The general culture in intelligence operations nowadays is, “I want to keep exploits that I find secret, because I value those exploits over the value the business would lose if it were exploited by somebody else.” It’s a trade-off, because there’s no such thing as a golden key that gives just you access to an organization and not anybody else.

Tal: Once the code’s broken, it’s broken.

Daniel: Exactly.

Tal: Anyone with that key can get in.

“Even though that comes at the cost of making it harder for us to hack these American organizations, or these organizations as a whole, we still count it as a net gain if they don’t get hacked by others. They preferred defense over offense. That changed after 9/11, when you had a huge influx of resources into the intelligence apparatus to collect as much information as possible to find the needle in the haystack.”

Daniel: Even Russia can break into it, if the Russians find it. Hoarding exploits has become the norm, as opposed to reporting exploits. That’s very frustrating as an American technology business owner because, in all likelihood, I would love it if-

Tal: Just to kind of pause here for a second and make sure everyone’s with us, an exploit would be like a back door to your iPhone, for instance, so that somebody could get into your iPhone, possibly even remotely, and you would have no idea but yet they would be able to, say, listen in to your calls or look at the kind of sites you’re browsing through your phone, what kind of apps you’re using, so on and so forth.

Daniel: Yes, they would be able to … An exploit is something that gives you access to something that you should not have access to.

Tal: They’re collecting all of these for all these different devices and all these different types of technology that we use, and they’re just storing them. They’re not telling the companies that this is a problem. They’re just sort of hiding them away in some sort of database where they keep all of them in case they need them for some sort of spying purposes. Great, okay. I’ll let you go ahead and explain what has happened from there.

Daniel: There is now a gigantic rift, at least in most of the tech industry, especially in the Bay Area, between the industry and the intelligence apparatus and the government, or at least an incredible amount of cynicism, because they know that the government does not have their back. The government is actually trying to find exploits in their systems and will not tell them about those exploits when it finds them. We’re on our own as far as the US technology industry goes, and it’s very frustrating, because you know that Russia could hack you and the NSA could’ve already found that exploit and not told you.

Tal: Furthermore, there was a news item, I believe pretty recently, where they found that those servers, those databases where the NSA holds those exploits, had been breached. Are you familiar with that?

Daniel: I am not familiar with that, but absolutely, that would not be surprising just because if Snowden can walk out with tons of information, it seems like internal security is pretty lax or at least not favored. Again, that kind of just further exemplifies the offense over defense culture.

Tal: I don’t remember the name of the article off the top of my head, but perhaps in the show notes I can put that in there for those who are interested.

We understand. This is why privacy, unwarranted search and seizure, and mass surveillance are all problems beyond the sort of obvious things that we would imagine. It’s more than just people collecting your dirty photos or listening in to your calls. There’s a wider security issue in terms of our social systems and how people could exploit them. We could even get into the criminal element and how people could do that, but from what we’ve said, you can extrapolate quite a bit.

“The government is actually trying to find exploits in their systems and will not tell them about those exploits when it finds them. We’re on our own as far as the US technology industry goes, and it’s very frustrating, because you know that Russia could hack you and the NSA could’ve already found that exploit and not told you.”

Daniel: There’s two other points that I wanted to touch on around mass surveillance. The first is that the general theme in most technology companies nowadays is that they don’t consider data a liability. When you collect data on users, you try to collect as much as possible, because you want to use all the latest, greatest, sexy technologies like AI and machine learning and blah blah blah, and there’s not really any downside to it currently. If you’re Equifax and you get hacked and all of your data gets dumped to the Internet, you still don’t go out of business. There’s not really a consequence to getting hacked. Yahoo got all of their email hacked, and they didn’t take that much of a stock hit.

There’s not really any sort of consequence to collecting data, and that’s unfortunate because, as the Snowden leaks revealed, the government is actively looking for sources of data to soak up. If you collect a lot of data on your users, you are a target. That counts for not just private companies but also local and municipal agencies. Think of your city, with its database of license plate reader data or video footage from its police car cameras. One of the things that Restore The Fourth is particularly focused on is local data collection that is then being shared with federal data centers. All of that is being fed into things like facial recognition engines, so when you go to a protest, they know exactly who these people are, when they were last seen, and their last six months’ worth of movement across the country. That flow of all of these data sets into the government’s hands is what we’re particularly focused on at Restore The Fourth.

Now, companies generally don’t think about that when they start collecting massive amounts of data, but as the Snowden leaks revealed, pretty much all of these major tech companies, and all the data they scoop up, are being made available for data mining by the government, voluntarily or involuntarily. It’s not necessarily a voluntary thing by the tech company. Google’s data centers got sniffed by the NSA when it directly tapped the links between the data centers.

“If you collect a lot of data on your users, you are a target. That counts for not just private companies but also local and municipal agencies.”

Tal: It’s not that the government is directly going out and collecting this data with its face in front of it. It’s more like they’re tapping the data collection efforts of your Googles and your Apples and your Amazons and so on, in order to put it to their own purposes.

Daniel: I don’t know if you’re familiar with the tech activist Maciej Ceglowski. He has a couple of talks online, and one of them, called Haunted by Data, talks about how data should be considered a toxic asset or a liability instead of a value-add asset. You should want to collect as little as possible, because it’s dangerous to collect more. That’s a philosophy we live by at UtilityAPI: we view data as fairly toxic, and so we try to handle it with care.

Tal: Explain that a little bit. For the companies, why does it become toxic? Is it because … Go ahead.

Daniel: We consider it toxic primarily because we, in our terms, don’t claim any ownership over it. It remains the property of the utility account holder and so when it’s on our systems, it’s basically under their control. They can control what they want to do with it. If they want to share it with X company because they want a quote from that company, they can do that but we do not have the right to do anything else with it. We can’t aggregate it or sell it or anonymize it or anything like that. It’s their data.

That restricts us in what we can do, obviously. We consider that a good thing, primarily because we want to have it on our systems for as little time as possible. If you’re a homeowner or a business owner and you want to get a quote, you need to share your data with that company in order to get the quote, and once you get the quote, the data should be gone. It shouldn’t stay on our servers, and so it doesn’t. We are a very transaction-oriented business, not a data-hoarding business. That mindset really puts us in the attitude of: when we get data, we want this transaction to be done and over with so we can move on with our lives. It’s great because it frees you up from having to worry about handling it in the future, or leaking it, or whatever, because you know it’s going to be gone in a short period of time anyway.
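
Here is a minimal sketch of what that transactional, consent-scoped handling might look like in code; the class names, fields, and expiry mechanics are hypothetical illustrations, not UtilityAPI’s actual implementation.

```python
# Sketch of "data as a liability": each record carries the account holder's
# consent scope and an expiry, and expired data is purged, not archived.

import time
from dataclasses import dataclass

@dataclass
class ConsentedRecord:
    owner: str            # the utility account holder keeps ownership
    shared_with: set      # parties the owner has consented to share with
    data: dict
    expires_at: float     # deadline after which the record must be deleted

class TransactionalStore:
    def __init__(self):
        self.records = []

    def add(self, record: ConsentedRecord):
        self.records.append(record)

    def read(self, requester: str, owner: str):
        """Only parties the owner consented to may read, and only before expiry."""
        return [r.data for r in self.records
                if r.owner == owner and requester in r.shared_with
                and time.time() < r.expires_at]

    def purge_expired(self):
        """Run periodically: once the transaction window closes, the data is gone."""
        self.records = [r for r in self.records if time.time() < r.expires_at]
```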

Tal: It’s kind of like a liability thing in a sense.

Daniel: We consider data a liability. That is pretty rare in the tech industry.

Tal: You have to take care of it, you have to put all this effort behind maintaining it and protecting it and ensuring that it’s able to … Also, there are issues because the data might not be accurate, and there are all kinds of flaws in it, and so on and so forth.

Daniel: Oh my God, yeah. You don’t want to build a solar system of a certain size and have it be the wrong size.

Tal: It’s interesting the way you frame that, because it seems like maybe one of the saving graces in this is that the toxicity of data might ultimately backfire on these forces that are trying to wield it for some sort of extra power. Is that fair to say?

Daniel: Exactly. If tech companies all of a sudden start treating data as a liability, the government won’t have as much data to scoop up. That’s one thing.

Tal: Interesting.

Daniel: The same thing goes for local cities and states. If I am a traffic light camera vendor for a city and the city has a policy of not keeping the data around for very long, that means I have to delete the data. The federal government doesn’t have access to historical traffic camera data for that city, because it gets deleted after a certain period of time, or deleted whenever traffic violations don’t occur, or whatever.

“If tech companies all of a sudden start treating data as a liability, the government won’t have as much data to scoop up.”

Tal: Vice versa, it can also be a toxic liability for a government, say, in terms of how other state actors could then take that same data and use it to manipulate that country’s population.

Daniel: Correct.

Tal: Either through cyber attacks or …

Daniel: We have worst-case-scenario stuff happening nowadays, where you’ve put so much emphasis on offense and haven’t put any resources into defense, and now you’re getting hosed on defense. You’re getting exploited left and right, and your population’s getting manipulated left and right, and it’s because you didn’t put any resources into shoring up your defense. The private sector is trying to catch up, but it’s not anywhere close to the level of resources you would need to fend off something like Russia.

Tal: Interesting. There’s an argument here now from sort of a national security perspective that this should be an issue as well. This should be something that we’re paying attention to.

Daniel: Absolutely. You always [crosstalk] about national security. Prior to [crosstalk]-

Tal: Now it’s kind of like the tables are kind of turned whereas at first it was like, “We need to break your privacy in order to secure you.” Now it’s, “We have to secure your privacy in order to protect you from larger state actors.”

Daniel: Right. The attitude that should be taken is: if we have access to your information, then Russia will have access to it too. In general, I agree with the pre-9/11 culture: American businesses are more valuable to us secure than they are with us knowing everything about them.

Tal: Interesting. Okay. That’s really great. Really, really interesting stuff. I think we’re getting close to the end of our time, and we’ve covered a lot of what we wanted to cover. In private, we’ve spoken a little about a concept that I want to bring up again to close our session. There’s an interesting point you brought up in that discussion about the newness of digital technology. You alluded to this a little in terms of there being few consequences, but basically it’s that we’re so new to these technologies, and not enough has gone wrong with them in consequential ways, for people to have focused in on it and said, “This is really a problem. We have to step in and do something about it.” I was wondering if you recall that part of the conversation and could expand on that idea for our audience. I thought it was really important to keep in mind. It added a nice global perspective on where we are historically with digital technologies and their pros and cons.

Daniel: Absolutely. Going back to a little bit on my background, I’m originally a chemical engineer and I did a co-op for a year at a major chemical plant in Houston. My roots are in classical engineering, hard engineering or hard asset engineering, I guess. I’m not a technologist by default. I’m an engineer, classical engineer by default. The thing that always sticks out to me is the huge disconnect in respect for quality between most other fields of engineering and software engineering. In civil engineering or mechanical engineering or chemical engineering, you have to be correct because if you’re not correct, a chemical plant blows up or a bridge collapses or a car fails and crashes into a bus or whatever. You have life and death consequences in most other fields of engineering. You don’t have that nearly as much in software engineering except for very, very few areas.

“You have life and death consequences in most other fields of engineering. You don’t have that nearly as much in software engineering except for very, very few areas.”

There’s a classic article from Fast Company back in 1996 called “They Write the Right Stuff,” about the group that wrote the space shuttle’s on-board software and the discipline it took to make that software nearly defect-free.

I come at this from the perspective that quality and good engineering are a solved problem. They are solved problems in other fields. We know how to do good engineering. The problem is that we’re not willing to spend the money on it, because nobody’s going to die if Facebook goes down for a couple of hours. The world might actually get more productive. If UtilityAPI goes down for a couple of hours, nobody’s going to die. Only in space shuttle software, or aerospace generally, will people die.

We have to start to feel pain in order to demand that we spend more on software engineering or spend more on quality. I feel like we’re starting to get there, but a whole bunch of bridges collapsed and a whole bunch of people died before we started enforcing bridge regulations. We are just at the start of an engineering field that has not really felt the consequences of society depending on it, which it needs to feel in order to mature enough to reach the level of quality of other engineering fields. I think we’re starting down that road, and the main consequence so far is losing an election because of shitty software.

It all starts with training for people. A lot of hacks depend on human stupidity. They depend on somebody clicking on a suspicious email. Improper training in how to properly use technology is a big aspect as well, which is why I focus on advocacy training: I don’t want advocacy groups to fall victim to stupid phishing schemes. We’re just in the infancy of that, but we’re starting to see a lot of the consequences of not putting enough time and resources into properly building good systems. Losing an election is one of the major consequences we have seen in recent times. There are a lot of articles on how voter registration databases have been hacked, and we’re not doing anything about it, or at least not currently.
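
One of the simplest checks taught in that kind of training is whether a link’s visible text matches the domain it actually points to; this sketch is a hypothetical illustration of that one heuristic, not a full phishing defense.

```python
# Toy phishing heuristic: flag links whose visible text shows one domain
# but whose actual target is another. Hypothetical example only.

from urllib.parse import urlparse

def looks_like_phish(display_text: str, href: str) -> bool:
    shown = urlparse(display_text if "://" in display_text
                     else "https://" + display_text).hostname or ""
    actual = urlparse(href).hostname or ""
    return shown != "" and shown != actual

print(looks_like_phish("sierraclub.org", "https://sierraclub.org/login"))    # False
print(looks_like_phish("sierraclub.org", "https://evil.example.com/login"))  # True
```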

“We have to start to feel pain in order to demand that we spend more on software engineering or spend more on quality.”

We’re still in the bridges-collapsing, people-dying, feeling-the-pain sort of phase. We’re not yet to the “Hey, we actually need to do something about this” phase. It’ll probably take us hundreds of years to get there. It took bridge builders hundreds of years.

Tal: Interesting.

Daniel: That’s my perspective, being kind of an outsider to technology, coming at it from a perspective [crosstalk]-

Tal: Engineer’s perspective, yeah.

Daniel: Good engineering is a solved problem in many other fields.

Tal: Interesting. Very interesting. Well, Daniel Roesler, thank you so much for your time and for your insights. This has been a very insightful talk. I think we’ve all learned a lot today about the connections between energy, data, and privacy in the near future. Thank you very much for joining us, and I hope you’ll come talk to us again. There’s so much more we could ask you about, and I would love for you to come back on the show sometime. Thank you for being here, Daniel, and that wraps it up for this episode. Stay tuned for the next one, and I hope to see you all again soon. Take care.

--
