
TripleBlind’s proprietary data privacy toolset facilitates the aggregation and analysis of data without exposing it. It is well suited for companies struggling to replicate the processes that were previously permitted under the Privacy Shield. TripleBlind’s technology allows entities to comply with GDPR regulations and achieve their business objectives, regardless of where they or their analysis are located.

According to a recent spate of articles, on July 16th the Court of Justice of the European Union invalidated the widely used EU-US data-transfer agreement known as Privacy Shield. In what has been termed the “Schrems II” decision, the EU’s top court ruled that the Privacy Shield agreement was not valid. This decision will cause “uncertainty for the thousands of large and small companies on both sides of the Atlantic that rely on Privacy Shield for their daily commercial data transfers” (Alexandre Roure, senior manager of public policy at the Computer & Communications Industry Association, in the Wall Street Journal). …


A couple of weeks ago, I left Ant Financial/Alibaba. I am filled with gratitude to Ant Financial & Alibaba and our global partners for the last 3 years — they have been an absolute blast. No one is enabling global financial inclusion at the rate Ant Financial is, and I’m grateful to have had the opportunity to foster that. I worked in China, Israel, the USA, Canada, Colombia, Mexico, Brazil, India, Indonesia, Singapore, Thailand, Malaysia, the Philippines, South Korea, Hong Kong, Japan, Macau, Germany, Russia, the UK, Finland, and several other countries. The work has grounded me and helped me understand how enabling global trust at the scale Ant does helps people self-actualize. …

Despite the turbulence in the markets over the past two weeks, there’s no doubt that blockchain is the most buzzworthy technology du jour. If you read anything from the blockchain prophets, this technology is going to usher in innovations that will fix everything that’s wrong with money, the internet, corrupt politicians, bankers, and everything else that’s broken in our world. Something like that, anyway.

The current consensus in the community is that the protocol will do the heavy lifting, and a thin layer of custom business logic will enable apps to run on each of these heavy protocols. If you believe the world is headed in this direction, there’s a good chance it will look something like this: there will be a publishing protocol, which will host all of your content, with custom apps for each channel that offer custom functionality. So the NYT, WSJ, and the Economist will still be accessed through their own apps, but you’ll pay for your subscription to content through the publishing token. …

The infrastructure is here already

In 2016, Joel Monegro published the Fat Protocol thesis. The main takeaway from that post was that in the emerging world of distributed applications (dApps), we have a chance to re-imagine what the fundamental protocols for this new generation of web applications are going to look like. Instead of building apps on top of standardized internet protocols that haven’t changed in decades (e.g. TCP, IP, UDP), we finally have a chance to rethink what those protocols should look like to support the next generation of applications.

While most members of the blockchain community seem to have accepted the fat protocol thesis, the consensus seems to be that to bring about the next generation of distributed & decentralized apps, the infrastructure needs to be there first. That is to say, the consensus appears to be that the killer app built on the blockchain isn’t here yet only because the infrastructure isn’t here yet. …

Fact: We would all like to make better, faster, data-driven decisions. However, with limited resources and time, that can be a tall order.

So, let’s keep things simple. Let’s address how you can use data to analyze socioeconomic and demographic trends in a community over the long term, and, specifically, how mySidewalk can help.

Real-Time Data vs. Historical Data

If you’re plugged into any of the chatter going on in the civic and gov tech world, you’ve probably heard a lot of discussion about the “Internet of Things” and sensor-generated data: information that is transmitted and analyzed in real time to produce insights and uncover patterns quickly. …

While building the data product at mySidewalk, we’ve spent a lot of time thinking about how best to make communities easier to understand. Our approach has been to build a product that saves you time when analyzing a project area, reveals insights that may not have been apparent, and helps you improve communication with all relevant stakeholders.

As my colleague Brian Parr discussed in his excellent blog post a few weeks ago:

A common task while working with spatial data is to summarize data for custom boundaries at different levels of geography.
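To make that summarization concrete, here is a toy Python sketch that counts point-level records per custom boundary. It is purely illustrative and not mySidewalk's implementation: the boundaries are simplified to axis-aligned bounding boxes, whereas a real workflow would use true polygon geometries (e.g. via a library such as geopandas and a spatial join).

```python
# Toy spatial aggregation: count points falling inside each named boundary.
# Boundaries are (xmin, ymin, xmax, ymax) boxes for simplicity; real
# geographies would be polygons with a point-in-polygon test.

def point_in_box(point, box):
    """Return True if an (x, y) point falls inside the bounding box."""
    x, y = point
    xmin, ymin, xmax, ymax = box
    return xmin <= x <= xmax and ymin <= y <= ymax

def summarize_by_boundary(points, boundaries):
    """Count data points per named boundary (a minimal spatial summary)."""
    totals = {name: 0 for name in boundaries}
    for point in points:
        for name, box in boundaries.items():
            if point_in_box(point, box):
                totals[name] += 1
    return totals

# Hypothetical project area split into two districts.
boundaries = {
    "district_a": (0, 0, 5, 5),
    "district_b": (5, 0, 10, 5),
}
points = [(1, 1), (2, 3), (6, 2), (9, 4), (12, 12)]
print(summarize_by_boundary(points, boundaries))
# {'district_a': 2, 'district_b': 2}  (the last point falls outside both)
```

The same pattern generalizes to summing attributes (population, incidents, dollars) instead of counting, and to nested levels of geography by running the summary once per boundary layer.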

There’s a lot of hype about big data, cloud computing, machine learning, and data science. In my experience, things get lost in translation between what is said by the non-technical stakeholder and what’s actually understood. In this piece, I explore the historical roots of these popular concepts and highlight that, while most of the techniques and foundations have been established for a while, the primary differentiating factor is perhaps just terminology.

A few weeks ago, in between flights, I decided to check out the Santa Monica Public Library. While perusing the mathematics and statistics section, I found a book called “Introduction to Statistical Analysis” published in 1951. Intrigued by the title (how was statistical analysis done in that time?), I flipped through the text and deduced that most of the ideas from the book are still the foundation of many techniques we use to analyze data today. …

Fun Fact: Over 90% of all the data we have in the world today was generated in the last two years.

In fact, almost every activity we do nowadays generates data. Data comes from more sources, in greater volume, velocity, and variety, than ever before. We’re generating data from cell phones, satellites, cameras, credit card transactions, user-generated social media content, law enforcement data, etc. Most of that data is generated and transmitted in real time. Collectively we generate more than 2.5 quintillion bytes of data every single day!

In 2012, a quote emerged in the big data world…

One. Understand the problem that data can solve
This may seem obvious, but data analysis projects seldom come pre-packaged as a clear, unambiguous data science/data mining problem. This is usually an iterative process of discovery, and creativity plays a major role in breaking down a business problem into one or more data mining problems.
The decisions that benefit from being informed by data analysis usually fall into two major categories:

  1. Decisions that require new discoveries within the data.
  2. Recurring decisions, made frequently. These decisions can benefit from even small increases in decision-making accuracy based on data analysis.

This is where domain knowledge of the industry and the business is immensely valuable. Knowing which questions to ask guides the entire process. Questions like which data sets to gather, what kinds of analyses to perform on the data, and how to communicate the results can all be answered iteratively as the problem you’re trying to solve with data becomes clearer and less ambiguous. …

According to a 2006 report from the Institute of Medicine, 1.5 million people were harmed by medication errors; similar studies indicate that 400,000 injuries occur yearly in hospitals as a direct result of medication errors. A large majority of these errors are pharmacy misfills.

The most common prescription errors occurred when the pharmacist dispensed someone else’s medicine to the customer, mostly due to misidentification of the patient. …


Riddhiman Das

Junior Worker at TripleBlind
