Post-AWS Summit Mexico City 2023

Daisy Warren
11 min read · Sep 12, 2023


Welcome to the Post-AWS Summit Mexico City 2023 — the second edition of a 2-Part AWS Summit series where I cover happenings from my experience attending a LATAM AWS Summit. There’s lots to unpack, so let’s begin with a roundup of what this article will cover.

AWS Summit

  • what the summit map looked like
  • my agenda on summit day
  • main sessions, chalk talks, parallel sessions
  • specific notes from each session + pics
  • what I enjoyed about each session
  • how I can incorporate what was discussed into my current workload
  • most central topics: AI + ML
  • overall highlight + feel of the summit
AWS Summit Mexico City 2023 entrance

In Part 1: Pre-AWS Summit Mexico 2023, you’ll find a 4-week roundup of resources leading up to the event, with:

  • an introduction to AWS & Summits
  • key edu-tainment & skill-building labs through AWS
  • cloud courses & resources to upskill
  • Mexico City 2023 Summit Partners

In Part 2: Post-AWS Summit Mexico City 2023, you’ll learn about key insights I gleaned from the week’s events as a whole, including:

  • personal notes from sessions
  • photos from my CDMX Summit experience
  • key services/tools emphasized + docs
  • what I enjoyed most about each session attended

After months of visioning, working toward it, connecting, and applying myself in a more meaningful way, this Mexico City vision came to life. Back in late 2022, I dissected how and where I wanted to focus my attention in 2023. My engagement with AWS made that list in a few key ways:

  1. to challenge myself to continually learn to use + implement services
  2. to share tools + tips + resources through a public facing lens
  3. to use my second language to connect with a more niche community within AWS — LATAM
Hello, it’s me, Daisy

Earlier this year, I was accepted into the AWS Developer Community Builder program, which offers technical resources, education, and networking opportunities to a niche group within the technical community. I have used that access to shape the route that has guided my engagements this year.

As I mentioned in Part 1, I had never actually been to an AWS Summit before, and although I’d heard about them and have virtually attended re:Invent since 2021, I wanted a slice of in-person engagement of my own, in a country where I speak the language and could meet my engagement commitments for the year.

➡ Enter AWS Summit Mexico City, which I’ll also be referring to as CDMX (Ciudad de México).

Live photo from the beginning of my day around 9:40 am local time.

The day began by picking up my badge, roaming around the ground floor expo area, taking the escalators up to the second floor to see where speaker sessions would take place and to get a general sense of where + what time to head to each of the sessions I marked in my agenda.

The first session I attended was held on the ground floor at the AWS Community Theatre, where there were 50+ seats plus standing room for a Dev Chat on using Amazon SageMaker Autopilot.

“Creating no-code machine learning solutions with Amazon SageMaker Autopilot” Dev Chat
  • This Dev Chat highlighted Amazon’s ML service SageMaker, whose Autopilot feature enables you (anyone, you don’t need an engineering title to work it) to create machine learning models with full visibility: you define the problem, choose a modeling strategy, and/or have Autopilot help you sort out which algorithms to train on the data set specific to your case. SageMaker gives you full visibility into the code it creates behind the scenes (a minimal sketch of kicking off an Autopilot job follows below).
    > What I enjoyed about this session is how fluid the speaker’s use-case demo was for a beginner-expert level audience.
    > Tool docs: Amazon SageMaker Autopilot
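
For anyone curious what kicking off an Autopilot job looks like outside the console, here’s a minimal sketch using the SageMaker Python SDK’s AutoML class. The bucket, dataset path, target column and IAM role are placeholders I made up for illustration, not anything shown in the Dev Chat.

```python
# Minimal SageMaker Autopilot sketch with the SageMaker Python SDK.
# Bucket, CSV path, target column and role ARN are illustrative placeholders.
import sagemaker
from sagemaker.automl.automl import AutoML

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/ExampleSageMakerRole"  # placeholder

automl = AutoML(
    role=role,
    target_attribute_name="churned",   # the column Autopilot should predict
    max_candidates=10,                 # cap how many candidate models it tries
    sagemaker_session=session,
)

# Training data: a CSV in S3 that contains the target column above.
automl.fit(inputs="s3://example-bucket/customers/train.csv", wait=False)

# Check status; once the job completes, best_candidate() returns the winner
# and Autopilot exposes the generated notebooks behind each trial.
print(automl.describe_auto_ml_job()["AutoMLJobStatus"])
```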

That was the only ground floor session I attended in the day — all others took place on the second floor stages and rooms. After this 30 minute Dev Chat, I made my way up toward the escalators and ended up meeting some of the CDMX training partner reps from a couple of the exhibitors I mentioned in Part 1 (Codster.io, Fastlane & CompuEducación).

As I have mentioned before, AWS does a phenomenal job at providing tools, resources and documentation. It was lovely getting to see how these organizations are using education as a business to fuel upskilling in the workforce and implementing cloud services to drive innovation.

My next session was the main keynote of the day with JC Gutierrez (Director of Solutions Architecture at AWS LATAM), where he gave an overview and brought on guest speakers from companies that are using Amazon Web Services to scale, including Kavak (Latam’s first unicorn, an auto company selling cars), Aeromexico, and others. Think:

  • healthcare: clinics using Amazon Bedrock (generative AI) to manage patient information & estimate cures in a serverless database
  • automotive: F1 using ML to run analytics | VW using IoT for intelligent inspection at car plants
  • finance: Affirm using ML to make underwriting decisions | Vanguard using 3+ services to build serverless event-driven applications
  • etc.
Kavak — Buy, finance, manage & sell pre-owned cars.

Take Kavak, for example: Latam’s first unicorn, headquartered in Mexico City with offices in Argentina and Turkey and operations in 10 countries. “The company to buy, finance, manage and sell pre-owned cars” is leveraging AWS to move from a monolith to a microservices architecture.

  • There was a heavy emphasis on using AI to build more intuitive solutions across several industries in LATAM. These industries (and others) are taking advantage of services like machine learning and serverless computing to minimize startup and ongoing costs.
    > What I enjoyed about this keynote was the multi-industry look at how companies have implemented services I’ve learned about, across different use-cases.
    > Tool docs: Amazon SageMaker ML Governance, Amazon Bedrock — Generative AI, AWS IoT
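
Since Bedrock came up repeatedly in the keynote, here’s a rough sketch of what calling a foundation model through it looks like with boto3. The model ID, prompt and request body format are assumptions on my part (each model family defines its own payload schema), so treat this as the shape of the call rather than a definitive example.

```python
# Rough Amazon Bedrock sketch with boto3. The model ID and request body
# follow the Titan text format as I understand it; other models use
# different payload schemas.
import boto3
import json

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "inputText": "Summarize this clinic intake note in two sentences: ...",
    "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.2},
})

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",   # placeholder model ID
    body=body,
    contentType="application/json",
    accept="application/json",
)

print(json.loads(response["body"].read()))
```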

Another topic that was widely emphasized at this Summit was the concept of governance. One of the 6 pillars of the AWS Well-Architected Framework is Security, and within it sits governance. As the AWS docs mention,

“Whenever you implement the control, the goal is the same: manage risk… Achieve risk management by following a layered approach to security control objectives…”

The concept spotlights a layered approach to security control: defining policies for who has access, and validations for those who are granted access.
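
To make the “who has access” half of that concrete, here’s a tiny policy-as-code sketch with boto3. The bucket and policy names are placeholders of mine; the point is simply that access is granted narrowly in a policy you can version, review and audit.

```python
# Minimal governance-flavored sketch: create a narrowly scoped IAM policy.
# Bucket and policy names are illustrative placeholders.
import boto3
import json

iam = boto3.client("iam")

read_only_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadOnlyAccessToReportsBucket",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-reports-bucket",
                "arn:aws:s3:::example-reports-bucket/*",
            ],
        }
    ],
}

iam.create_policy(
    PolicyName="ReportsReadOnly",
    PolicyDocument=json.dumps(read_only_policy),
    Description="Least-privilege read access to the reports bucket",
)
```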

Following this keynote was a session on storage.

“Optimizing Amazon S3: Better storing, improving & reducing costs” Chalk Talk
  • This Chalk Talk was a full hour packed with all things Amazon S3, from S3 Intelligent-Tiering (to reduce costs and manage archive-ready data) to S3 Glacier Deep Archive (for long-term archives, with standard restores completing within about 12 hours). S3 Intelligent-Tiering is a storage class of Amazon’s Simple Storage Service that lets you optimize usage and cost by archiving media/content that is no longer being accessed or in demand.
    * A smart move is to predefine this in a lifecycle rule (for example, a transition after 30 days) into Intelligent-Tiering, OR, if that content/media is stored in Standard, to archive it using metadata/tags that mark the object as no longer active (a minimal lifecycle sketch follows below).
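
As a rough illustration of that lifecycle idea, this is roughly what a 30-day transition into Intelligent-Tiering looks like with boto3; the bucket name, rule ID and prefix are placeholders I chose for the example.

```python
# Sketch: transition objects under a prefix into S3 Intelligent-Tiering
# after 30 days. Bucket name, rule ID and prefix are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-media-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "move-older-media-to-intelligent-tiering",
                "Filter": {"Prefix": "media/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "INTELLIGENT_TIERING"}
                ],
            }
        ]
    },
)
```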

You may be wondering: what is Amazon S3? If you’re unsure what I mean by S3, here’s an article I wrote, featured on AWS in Plain English, that breaks it down into bite-sized bits.

S3 storage classes

What is Amazon S3 Intelligent Tiering?

➡ S3 Intelligent Tiering is the only storage class that offers:

  • automatic storage cost savings
  • automatically moving objects between access tiers as access patterns change
  • an Archive Instant Access tier that reduces storage costs by up to 68% for rarely accessed data, plus optional deeper archive tiers (a configuration sketch for those follows this list)
  • no retrieval fees, just a small monthly monitoring & automation charge
  • designed for 99.9% availability and 99.999999999% (11 nines) durability
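
For the optional archive tiers mentioned above, a bucket-level Intelligent-Tiering configuration opts objects into the deeper, cheaper tiers after long periods without access. A rough boto3 sketch follows; the bucket, configuration ID and day thresholds are placeholder values.

```python
# Sketch: opt a bucket's Intelligent-Tiering objects into the optional
# archive access tiers after long periods without access.
# Bucket name, config ID and day counts are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_intelligent_tiering_configuration(
    Bucket="example-media-bucket",
    Id="archive-cold-objects",
    IntelligentTieringConfiguration={
        "Id": "archive-cold-objects",
        "Status": "Enabled",
        "Tierings": [
            {"Days": 90, "AccessTier": "ARCHIVE_ACCESS"},
            {"Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS"},
        ],
    },
)
```
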
Pillars for optimizing storage strategy

The pillars for optimizing storage strategy by use-case

  1. Organizing data
  2. Scaling horizontally
  3. Supervising & automating

Amazon S3 use-cases include (but are not limited to):

  • storing media
  • storing archived content
  • logging data
  • storing data from ML models
  • medical forms and images
  • IoT sensor data
  • storing medical data
  • storing surveillance videos
  • hosting websites
  • storing user data
  • storing autonomous vehicle data

The type of bucket and storage class we put our objects into depends on the specific use case. A step beyond this with Intelligent Tiering is to consider versioning: versioning lets us keep versions of objects and, through lifecycle policies, define which events trigger a transition and when.
* This metadata can look like being clear about which buckets/objects are live (still in use) versus which aren’t (ready to be archived); a tiny sketch of that follows below.
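
As a loose sketch of that idea (bucket, key and tag values are purely illustrative): enable versioning on the bucket, then tag objects that are no longer live so a tag-filtered lifecycle rule can archive them.

```python
# Sketch: enable versioning, then mark an object as no longer live with a tag
# that a tag-filtered lifecycle rule could use to archive it.
# Bucket, key and tag values are illustrative placeholders.
import boto3

s3 = boto3.client("s3")

# Keep every version of each object so nothing is lost when rules run.
s3.put_bucket_versioning(
    Bucket="example-media-bucket",
    VersioningConfiguration={"Status": "Enabled"},
)

# Tag an object that is no longer "live"; a lifecycle rule filtered on this
# tag can then transition it to an archive storage class.
s3.put_object_tagging(
    Bucket="example-media-bucket",
    Key="media/2021/launch-video.mp4",
    Tagging={"TagSet": [{"Key": "state", "Value": "archived"}]},
)
```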

How can we implement S3 Intelligent Tiering into a workflow?

➡ By creating event-driven applications that trigger a notification when objects become less frequently accessed, so they can be moved into archive. This can be achieved in a number of ways, including versioning, pre-defining metadata, or activating different regions. The key here is to optimize the storage strategy by use-case (a wiring sketch with EventBridge follows these notes).
> What I enjoyed about this session was how in-depth it went on working smarter with Intelligent-Tiering. Part of creating value and ROI with cloud services is not only knowing what you can use to manage your workload, but how to use tools that manage that workload efficiently and keep it stable. I really liked how both speakers asked for audience use-cases and demonstrated routes with specific services to break down WHY and HOW S3 is adaptable, flexible and optimal for any industry use-case.
> Tool docs: S3 Intelligent Tiering, S3 Glacier, Amazon EventBridge, S3 Standard, S3 Storage Classes list
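
One way to wire up that event-driven idea, sketched with boto3: turn on EventBridge delivery for the bucket, then create a rule that reacts when S3 reports an object’s access tier changed. The names are placeholders, the rule still needs a target attached (a Lambda function, an SNS topic, etc.), and the detail-type string is the S3 EventBridge event name as I recall it, so double-check it against the docs.

```python
# Sketch: event-driven reaction to Intelligent-Tiering moving objects.
# Bucket and rule names are placeholders; attach a target to the rule
# separately (Lambda, SNS, Step Functions, ...).
import boto3
import json

s3 = boto3.client("s3")
events = boto3.client("events")

# Send this bucket's object-level events to EventBridge.
s3.put_bucket_notification_configuration(
    Bucket="example-media-bucket",
    NotificationConfiguration={"EventBridgeConfiguration": {}},
)

# React when objects in that bucket change access tier.
events.put_rule(
    Name="media-access-tier-changed",
    EventPattern=json.dumps({
        "source": ["aws.s3"],
        "detail-type": ["Object Access Tier Changed"],
        "detail": {"bucket": {"name": ["example-media-bucket"]}},
    }),
    State="ENABLED",
)
```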

The last couple of sessions I attended were parallel sessions in the main hall focused on using AI. Since there were a few stages and sessions happening at the same time, we listened through headphones color-coordinated with each session.

Apparently this was not for a silent disco at the Summit.

The one from 2pm-3pm, called “Implementing AI projects with Generative AI using Amazon SageMaker Jumpstart,” and the other, called “Accelerate Application Development using Generative AI,” from 3:30pm to 4:40pm, were both focused on expanding what’s possible to build and speeding up development with the help of AI.

Similar to the morning Dev Chat (on SageMaker Autopilot), we were given a demo, but the difference here was that several speakers gave different use-case demos implementing AI models with SageMaker Jumpstart. Essentially, SageMaker Jumpstart is a hub of foundation models (algorithms, models and ML solutions) that are ready to use, fine-tune and deploy quickly (a minimal deployment sketch follows the model list below).

Amazon SageMaker Jumpstart provides an excellent range of foundation models through AWS Marketplace model subscription. Some of these models on SageMaker Studio include (but are not limited to):

  • text generation (general text generation, multilingual text classification, chatbot interactions, paraphrasing, etc.) with HuggingFace, BloomZ, GPT-Neo
  • image generation (text-to-image) with Stable Diffusion
  • model customization (fine-tuning text generation, etc.) with HuggingFace
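
If you’d rather poke at Jumpstart programmatically than through SageMaker Studio, deployment looks roughly like this with the SageMaker Python SDK. The model ID and request payload are my assumptions (they vary by model), and this presumes it runs somewhere with a SageMaker execution role available, such as Studio.

```python
# Sketch: deploy a JumpStart foundation model with the SageMaker Python SDK.
# The model ID and payload are illustrative; check the JumpStart catalog for
# current IDs, and note that deploy() launches billable endpoint infrastructure.
from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(model_id="huggingface-text2text-flan-t5-xl")
predictor = model.deploy()  # creates a real-time inference endpoint

# Payload schema differs per model family; this follows the text2text format.
print(predictor.predict({"text_inputs": "Translate to Spanish: storage class"}))

predictor.delete_endpoint()  # clean up so the endpoint stops billing
```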

2 resources that stood out to me were: AI Use Case Explorer & ML Solutions Library

Find out what’s possible by use case with AI Use Case Explorer
Browse solutions and learn how to implement them with ML Solutions library
  • Both of these sites give an expanded look into what is possible with generative AI using another branch of SageMaker — Jumpstart. With ready-to-use kits by case and full visibility to what’s happening behind the scenes, AWS makes it possible to jump into the world of ML regardless of background.
    > What I really enjoyed about this session was the introduction to a range of ML tools that I haven’t explored. I’ve heard of a few from the Marketplace, but this session gave me a serious insight into what is possible with generative AI. Many of us know that text-to-image, chatbot interactions and forecasts are possible with ML, but this session was packed with a library of solutions that I have been excited about working with.
    > Tool docs: Amazon SageMaker Jumpstart, Jumpstart Foundation Models, HuggingFace, Stable Diffusion, SageMaker Studio, AI Use Case Explorer, ML Solutions Library

The final session on my day’s agenda was “Accelerate Application Development using Generative AI,” with an emphasis on Amazon CodeWhisperer, an AI coding companion designed to help developers build applications faster and more securely.

CodeWhisperer as a code companion
  • CodeWhisperer can be integrated into IDEs (VS Code, I’m looking at you, & more) and supports several languages, which makes it that much more accessible to a wide spectrum of programmers. With simple comments and existing code, CodeWhisperer can generate code snippets + function suggestions that match your context, as well as flag + filter vulnerabilities + offer suggestions to rectify them then and there. Whether you’re a novice programmer just diving into learning or a daily programmer, CodeWhisperer positions itself as a code companion meant to integrate seamlessly into your learning environment or workflow (an illustrative snippet of the comment-to-code flow follows below).
    > What I loved about this session is how almost all attendees were software developers hyped about building with AI. It wasn’t so much about using the tool as a copy-all, but rather a good sense of camaraderie and appreciation for how a code companion can help speed up very mundane tasks.
    > Tool docs: Amazon CodeWhisperer
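
To give a feel for that comment-to-code flow, here’s the kind of prompt-and-suggestion pair I mean. This snippet is my own illustration of the pattern, not output captured from CodeWhisperer during the session.

```python
# Illustration of the comment-to-code pattern (my own example, not actual
# CodeWhisperer output): you write the comment, the companion drafts the body.

# upload a local file to an S3 bucket and return the object URL
import boto3

def upload_report(file_path: str, bucket: str, key: str) -> str:
    s3 = boto3.client("s3")
    s3.upload_file(file_path, bucket, key)
    return f"https://{bucket}.s3.amazonaws.com/{key}"
```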

Finished off the AWS Summit Mexico City 2023 day fueled by 3 cups of coffee, 5 cups of water, a full heart from being inspired & ready for my CDMX wine and dine tour week!

Post-AWS Summit Mexico City 2023 fueled by 3 cups of coffee + 5 cups of water + a full heart

Though there are several highlights I took away from the CDMX Summit, one key point that was spotlighted throughout the day (and throughout this year) has been expanding the range of possibility of what’s already happening as we shift to using AI as a tool. Instead of viewing Artificial Intelligence as a threat to humanity, I believe that it should be seen as a tool to complement the work that human intelligence creates. We are very much in the beginning stages of a massive technological shift, and it’s super exciting to be part of a community that is pushing accessibility to education for it + providing the kits to explore the AI world + continuously striving to innovate!

Over the last couple of years, as I have been exploring what AWS has to offer, studying service use-cases, implementing them in my workload and placing myself in a cloud learning container, I’ve recognized the value of immersing myself in places where growth is a byproduct of the environment. I set a personal goal to continuously document my learnings/understandings, learn from the people around me with parallel focus, and reach out and support others on their paths, with the ultimate goal of truly immersing myself in growing, learning, building and playing.

Last year I introduced Segments — a structured system of sharing centered around 4 foundational values to nourish neuroplasticity:

growing | learning | building | playing

Visit the Create Rutina publication to read about how I curated Segments & how I now use the structure to drive my day-to-day interactions. I’d say this structure has created a healthy dose of accountability, challenge, stimulation and play, and perhaps it may for you as well!
