Today it is not possible to talk about data without bringing topics such as artificial intelligence, machine learning, and data engineering to the table. Indeed, although AI has been researched and debated since the 1950s, only in recent years has it gained hype and become part of the business vocabulary.
It can sometimes be hard to see beneath all the buzzwords, but data and AI have truly been at the forefront of groundbreaking new business models and societal innovation over the past years. It is time for business and technical leaders to take advantage of what AI has to offer.
Having access to the internet anywhere and everywhere has become essential in our day-to-day lives. In the short span of a couple of years, we went from the internet being considered a luxury (especially on the go, i.e. on mobile) to an everyday commodity.
WiFi became part of our everyday vocabulary, yet its inner workings are still a mystery to many. According to a survey by Symantec, 87% of U.S. consumers have used readily available public WiFi to connect to the internet (in cafes, airports, hotels, etc.). Moreover, 60% of consumers think their information is safe when using public WiFi.
I had been working with AWS every day for over a year when the thought of pursuing an AWS Certification crossed my mind. Why get certified?
Validating your existing knowledge, challenging yourself, and enhancing your resume are always good reasons to do so, but beyond those, I was really interested in exploring the AWS ecosystem further.
My thinking is that when you choose to move to a cloud provider such as AWS (or GCP, or Azure), you really want to make sure you are leveraging its managed services to the fullest extent: message queues, analytics, storage, etc.
The use of internet proxies is quite common nowadays as a means of imposing restrictions (for security purposes) on how an application or machine can connect to the internet.
The vast majority of applications and libraries provide support for HTTP/HTTPS/FTP proxies. The usual way to enable a proxy on a Linux system is by setting environment variables:
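As a minimal sketch, these are the conventional variables most Linux tools honor (the proxy host and port below are placeholders, not a real endpoint):

```shell
# Point HTTP/HTTPS/FTP traffic at the proxy (hypothetical host and port)
export http_proxy="http://proxy.example.com:3128"
export https_proxy="http://proxy.example.com:3128"
export ftp_proxy="http://proxy.example.com:3128"
# Bypass the proxy for local addresses
export no_proxy="localhost,127.0.0.1"
```

Note that many tools also read the uppercase variants (`HTTP_PROXY`, `HTTPS_PROXY`, and so on), so it is common to export both forms.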
Usually, setting these variables (or, alternatively, the application's own configuration) does the trick. However… there are corner cases where this does not work.
I originally started to explore different alternatives because of the Scrapy framework…
Nowadays more and more DevOps teams are shifting towards DevSecOps. Given the world we live in, the security aspect of software engineering is now crucial and fundamental. No longer can we simply rely on InfoSec departments getting involved at a later phase to help improve a system's security. It needs to be considered from the get-go, as a basic element (similar to the infrastructure, CI/CD, etc.), by the same people designing and developing the system.
In this context, Threat Modeling sessions are becoming more and more popular among Engineering teams. …
A quick 15-minute walkthrough with a Squid proxy and Docker
It has been a couple of years since I set up an Elastic Stack (ELK) for centralized application logging. A lot has changed since then, and while taking a fresh look at it, I decided to write this quick walkthrough to share my insights and get you up and running fast.
What is the Elastic (ELK) Stack?
In a nutshell, Elastic (previously known as Elasticsearch) provides three core projects: Elasticsearch, a search and analytics engine; Logstash, a data processing and transformation pipeline; and Kibana, a web…
You just got your hands on some raw data files (JSON, CSV, etc.). What happens now? How do you make sense of them?
You open a console and start using less, grep, jq, and other tools. It's great at first, but it quickly becomes complex and hard to do anything beyond the basics.
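To make that concrete, here is a sketch of the console-first approach (the `events.json` file and its fields are invented for illustration):

```shell
# Create a hypothetical log file with one JSON object per line
cat > events.json <<'EOF'
{"level":"error","service":"api","msg":"timeout"}
{"level":"info","service":"api","msg":"ok"}
{"level":"error","service":"web","msg":"500"}
EOF

# Count error events with grep
grep -c '"level":"error"' events.json

# Extract the service field of error events with jq
jq -r 'select(.level == "error") | .service' events.json
```

This works for a quick look, but anything beyond simple filters (aggregations, time ranges, dashboards) gets painful fast.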
Does this sound familiar? Great! Keep reading and learn how Splunk can help you out.