Day 3 #GlobalAI Conference Santa Clara

Greetings from Day 3 of the #GlobalAI Conference. I’m speaking at two sessions today. Earlier this morning I shared the slide deck and a pre-read in MP3 format on LinkedIn.

Here are the talks I listened to today with a few notes:

Business Track: The Autonomous Pharmacy: Applying AI and ML to Medication Management Across the Care Continuum (Ken Perez) – This was an interesting way to start the day. A lot of this talk focused on targeting and adherence, both of which are ultimately about helping people get and sustain care.

General Keynote Session: The Pros and Cons of Automated Machine Learning in Healthcare (Sanjeev Kumar) – This talk dug into silos and data quality, two things that make it very hard to use dispersed and largely inaccessible data locked away in legacy systems.

General Keynote Session: Google’s Journey to AI-First (Chanchal Chatterjee) –

Technical Track: Case studies in AI for medical devices (Moshe Safran) – 

Day 2 #GlobalAI Conference Santa Clara

Laurent El Ghaoui speaking at a conference

http://www.globalbigdataconference.com/santa-clara/4th-annual-global-artificial-intelligence-conference/schedule-121.html

My notes from yesterday were a bit unorthodox. A lot of them were just links, or a couple of words to learn more about later. I’m going to head over to the conference in a little bit. At the end of my Day 2 notes I’ll add a bit of commentary about the talks I attended.

Day 2: Recap of what I learned and what talks I attended… 

Business Track: Virtualizing ML/AI and data science workloads (Michael Zimmerman) – This talk was very interesting, and Michael extolled the virtues of reading the paper linked below.

https://papers.nips.cc/paper/5656-hidden-technical-debt-in-machine-learning-systems.pdf

General Keynote Session: Implicit deep learning and robustness (Laurent El Ghaoui) – This presentation was really solid. When I convert my own presentation into a full paper, I hope the formulas are as elegantly presented as what Laurent was able to produce. This talk really set the bar for walking an audience through formulas.

General Keynote Session: Pitfalls and panacea: AI and Cybersecurity (Wayne Chung) – It turns out my USB Type-C to HDMI cable works. It got an actual field test during this presentation. This turned out to be a very interesting talk full of security content and timelines. It was highly engaging.

Technical Track: AI methods for Formal Reasoning (Christian Szegedy) – This talk covered moving AI efforts toward true understanding and reasoning. I took a screenshot of a bunch of papers that will be added to my reading list.

Technical Track: Challenges in machine learning from model building to deployment at scale (Anupama Joshi) – This talk really dug into the ML software development life cycle and how that is managed.

Workshop: TensorFlow.js: Machine Learning In and Out of the Browser (Brian Sletten) – This talk covered using machine learning at the edge. It was a longer, several-hour session that went into more detail and was more hands-on with the product, so I did not take a ton of notes.

Day 1: Recap of what I listened to during the day… 

Workshop: Building Real World AI Solutions (Alexander Liss & Michael Liu) – This was a four-hour workshop on using TensorFlow for machine learning on AWS.

Technical Track (Finance): Group Theory, Chaos and Financial Time Series (Revant Nayar) – This is one of those presentations that needed a much better projector setup. It was hard to read the notation on the screen. I’m going to see if I can get the deck later to dig into it a little more.

Technical Track (Finance): Machine Learning In Finance (Chakri Cherukuri) – The visualizations used during this presentation were top-notch. I have been impressed with the team from Bloomberg.

Technical Track: Image Augmentations for Semantic Segmentation and Object Detection (Vladimir Iglovikov) – This talk was partly a pitch for using Kaggle and competing in machine learning competitions. I wish Vladimir had leaned into that and really talked about what it takes to compete and how that process works; the talk dove into the results rather than the mechanics of how those competitions occur. I’m going to spend some time learning about Kaggle on the flight home tomorrow.

Finance Track: Data in Finance/Banking (Ryan Lee) – This talk could have gone a little deeper into the management and use of big data.

Other topic…

I have decided that getting a paper accepted at NIPS is a noble pursuit. 

https://medium.com/machine-learning-in-practice/nips-accepted-papers-stats-26f124843aa0

Working on some deliverables

Routines have given way to a nearly endless stream of deliverables throughout the last few days. That happens from time to time; tasks pile up and end up crushing routines. I spent the last few minutes of the day working on a few things that needed to be done. One of them was putting together a bit of content for a conference this summer.

DATAx 2020 Conference Topic Area: Machine Learning

Session Description (100 words inclusive of title):

Title: Figuring out applied ML: Building ROI models, repeatable frameworks, and teams to operationalize ML at scale.

Description: Solving the hard problems requires operationalizing ML at scale. Doing that in a definable and repeatable way takes planning and practice. Understanding how to match the deep understanding of subject matter experts to the technical application of ML programs remains a real barrier to applied ML in the workplace. Understanding applied machine learning models with strong potential return on investment strategies helps make delivery a definable and repeatable process.

Well, that worked out to a total of 87 words. Maybe I should sit down and write another sentence to flesh out the full 100-word quota.

Three Audience Takeaways

  1. Beginning to think about the process of building machine learning ROI models
  2. Setting the foundation for defining repeatable machine learning frameworks
  3. Building teams to operationalize machine learning at scale

Well, that is the content I needed to generate before the end of the day. Tomorrow I need to spend some time working on some new slides, which is going to take a little focus. Some of that content was sketched out by hand the other day. Maybe I should have started with the end product in mind instead of back-of-the-napkin sketches on this one; that might have helped turn the slides into reality a little faster. This approach is both delaying the final product and maybe improving it. Sometimes you have to produce a couple of drafts of something to get to the finish line. Other times you only need to sit down and write it once to create the final product.

Preparing something new

This is a recording of my blog from December 13, 2019

My thoughts right now are a bit mixed. Earlier today I sat down to write up some thoughts on future topics to speak about at conferences. Thinking about future topics is important to do from time to time. Mostly, the way I start building a conference presentation is from a spark of creativity and a deadline. I’m working on something new for January right now, and I probably need to devote a little time each day to moving the new presentation along. That might take precedence over finishing up a few classes, but focusing on both might just help speed things along.

Thoughts on the #AISummit New York City 2019

This is a recording of my blog from December 12, 2019

New York City is a bustling place full of people doing things. It is hard to say exactly what they are doing, but they are doing it with purpose. The conference center where the AI Summit was held was astoundingly large. At 5,000 people, the AI Summit was the smaller conference in the building; a produce group had 30,000 people in attendance, a conference bigger than a lot of small towns in America. Preparing to attend an event that large took a bit of reading and learning about who was attending. They had special areas for the speakers to rest away from the crowds. Initially, I did not think that would be important, but it was nice to sit down for a bit and enjoy being out of the crowd. The noise levels in the different speaking areas made things difficult to hear from time to time, but the people were great, and a lot of them knew a ton about artificial intelligence and machine learning.

Break — Here are some stream of consciousness thoughts about my flight

Two hours and twenty-four minutes stand between me and the end of this flight. I just started National Lampoon’s Christmas Vacation from the airline movie list. My Pixelbook Go is displaying two browser windows split down the middle vertically: on the left is this document, and on the right is a browser with the movie playing. The one thing I forgot at the start of this trip was my set of Bose noise-canceling headphones. Right now I have a wired headset plugged in via a USB Type-C dongle. This set of Bose sport headphones is not in any way noise canceling. This is the first set of flights I have been on in a long time where the background noise is present, and it is a lot louder than I remember.