LiveRamp Engineering

Joining Petabytes of Data Per Day: How LiveRamp Powers its Matching Product (Sep 16, 2019)
Our data matching service processes ~10 petabytes of data per day and generates ~1 petabyte of compressed output every day. It continuously…
Distributed Tracing at Massive Scale (Aug 5, 2019)
When a developer thinks about monitoring and observability of their production application, two things generally come to mind: metrics and…
Initial Release of Workflow2 — LiveRamp’s Big-Data Workflow Orchestrator (Jun 25, 2019)
We are excited to announce the initial open-source release of Workflow2, LiveRamp’s big data pipeline orchestrator!
LiveRamp Hackweek: Serializing and Transmitting Bytecode Between JVMs (Jun 20, 2019)
What if we could build “higher-order” big-data applications in the same way we build higher-order functions?
Migrating a Big Data Environment to the Cloud, Part 5 (Jun 12, 2019)
What comes next? How are we going to re-engineer to take advantage of being on GCP?
Migrating a Big Data Environment to the Cloud, Part 4 (Jun 4, 2019)
Copying to the cloud
Migrating a Big Data Environment to the Cloud, Part 3 (May 29, 2019)
How do we get to the cloud?
Migrating a Big Data Environment to the Cloud, Part 2 (May 17, 2019)
Starting the journey to GCP
Migrating a Big Data Environment to the Cloud, Part 1 (May 13, 2019)
LiveRamp is a big data company that is migrating from its on-premises data center to the cloud.