AI With Friends

EP4: OpenAI's DevDay, AI-Powered Smart Glasses, OpenAI's Record Fundraising, and the AI Workforce

AI With Friends LLC Season 1 Episode 4


In Episode 4 of AI With Friends, hosts Marlon and Sekou discuss the latest breakthroughs and debates in the world of artificial intelligence.

  • OpenAI's DevDay Announcements: We explore the exciting launch of OpenAI’s real-time API, discussing how it enables developers to build voice-to-voice AI applications and what this means for the future of interactive technology.
  • AI-Powered Smart Glasses and Privacy Concerns: The team examines a recent project by two Harvard students who showcased how easily facial recognition can be integrated into smart glasses. We debate the privacy implications and the fine line between innovation and intrusion.
  • OpenAI's Record-Breaking Fundraising: With OpenAI closing a massive $6.6 billion funding round, we analyze the impact on the AI industry, the strategic investments from tech giants like Microsoft, Nvidia, and SoftBank, and what this means for competitors.
  • The Future of AI in the Workforce: We discuss a Harvard survey indicating that while AI won't replace managers, those who harness AI will outperform those who don't. We consider the implications for current and future leaders and the importance of integrating AI into daily workflows.


----------------------------------------------------

Welcome to AI With Friends, your weekly launchpad into the world of Artificial Intelligence. Hosted by Marlon Avery, a pioneer in GenAI innovation, alongside Adrian Green, VP of Engineering at LiveNation, and Sekou Doumbouya, Senior Staff Cloud Systems Engineer at Pinterest, this show is your go-to source for all things AI.

Our hosts bring diverse expertise—from AI strategy and tech innovation to industry leadership. Every week, they break down the latest AI trends, interview top experts, and simplify complex concepts for AI enthusiasts, entrepreneurs, and tech professionals alike.

Marlon, Adrian, and Sekou combine their unique perspectives, whether it’s Marlon’s collaborations with tech giants, Adrian’s leadership in global entertainment engineering, or Sekou’s cloud systems expertise. Together, they make AI insights accessible, actionable, and exciting.

Tune in live on Twitch & YouTube every Wednesday at 9:00 PM ET, or catch us on all major podcast platforms.

Here’s what you’ll get:

Cutting-edge insights from AI leaders
Real-world applications of AI technology
A vibrant community of forward-thinkers

If you're ready to stay ahead of AI trends or spark your next big idea, join us each week for an hour of engaging, thought-provoking content.


Subscribe now and become part of the future of AI with AI With Friends!



Hey, hey, hey, what's going on, what's going on, my friend? How you doing? Doing good, doing good over here. How about yourself? Man, I'm well, I'm doing well. Safe from the hurricane, man. We had a little scare there. I know a lot of individuals, families, homes, towns that got impacted. That's also where our friend Adrian is. Adrian is helping some of his family and friends that got impacted by the hurricane, man. So definitely want to give our love and a shout-out to Adrian. Hope you're doing well, and I'm excited for you to be able to help our people in a time of need. But thank you, man. How was your week? Oh, man. It's been crazy. It's a crazy week in tech, a crazy week at work. I have a job that always keeps me busy. I'm an engineer over at Pinterest, a cloud engineer, and we're getting toward the end of the year, holiday season. It's a big time for folks. Yeah. So I know they're working hard over there, and I know you're doing your due diligence, taking everything day by day. Maxing my output, man. So it's going to be an exciting end of the year. But yeah, man, my week has been good. I've been building, and I've got a couple of workshops coming up. I actually have a workshop tomorrow and Friday with our friends over at Microsoft, teaching some AI implementations on top of Azure. So that'll be interesting. That'll be fun. And then, man, I've just been working, building, and finally we got some of the things we've been waiting for.
And I am excited and also nervous, because the engineer in me may just ditch all responsibilities and just build some stuff. It's already happening over here, the games we are playing with ChatGPT voice mode these days. So man, man, man. Last week we did the ending segment with the advanced voice mode, and we had it close us out in a Jamaican Patois accent. That's everyone's first thing if you make it speak in an accent. I did the same exact thing. It's a challenging accent, you know. It's challenging to do it properly, it's challenging to understand, and also to know when it's been done properly. So it's one of those things like, let's try it out, and it's pretty good. Yeah, I think me and my brother, when this thing came out, we took it maybe just one step further, because we had a Jamaican Patois accent, but then we wanted it to talk like a pirate at the same time. That was a request by my nephew there. You know what? This is going to be a new thing. We're going to come up with a new accent, whatever it may be, as a close-out of the episode, have advanced voice mode close out the show, and see how it does. That'll be interesting. I mean, let's dive into it, man. OpenAI has had some interesting news this week. Just yesterday, OpenAI held their big 2024 Dev Day, and there were some interesting things. If you remember, last year it was a big marketing push and they did their Dev Day live on YouTube. But this year, something quite different: it was private, invite only. Even right now, they're saying that to see the sessions you have to request access. But they did give us the news about what they were talking about.
So one of the most exciting announcements was the launch of the public beta of their Realtime API, a tool that allows software engineers to build apps that provide almost instant voice responses generated by artificial intelligence. It's not quite the same as ChatGPT's voice mode, but it is getting there. During a press briefing, OpenAI's chief product officer, Kevin Weil, reassured everyone that the recent departures of key figures won't slow them down. OpenAI is focused on proving that they still have the best platform for AI development, despite increasing competition from companies like Meta and Google. One of the core features of the Realtime API is the ability to create voice-to-voice conversations, offering developers six voices to work with, which they can use to create almost real-time speech-based applications: things like trip planners that respond to voice commands, or apps that can talk to customers on the phone by integrating with their services. In addition, OpenAI also rolled out some new tools, like vision fine-tuning with GPT-4o, which helps developers build apps around visual data. Despite the buzz, there were no new announcements about the GPT Store, which developers are still waiting on; they said, in fact, they've been waiting a little too long for those updates anyway. Sekou, man, what are your thoughts? This is Realtime API access for individuals like yourself and I. What do you think? Oh, I think they're essentially bringing advanced voice mode to the platform, so we can build apps the same way OpenAI has done it on their front end. This is funny. This was one of the first use cases we really talked about, using OpenAI this way, even before it actually had voice.
I think we were talking about how we could integrate this, and I think it's a pretty big leap. It's a pretty big leap. And I feel like the cost on it is actually pretty compelling to start with. It's at a price point where you can tinker with it without burning your pockets, and you can build it at scale for a customer as an integration without it being the main show in the offering you're creating. So I don't know, there's just a lot of potential. I'm waiting to see what comes out of it, and what I can do tinkering with it. There's another thing that popped up on this Dev Day: they're doing prompt caching now. They brought that into the fold, which I think Anthropic kind of got to first, so they're playing a little bit of catch-up there. Which I appreciate, but real quick, tell the people about prompt caching and what it is. Yeah. The idea behind prompt caching is that whenever part of a prompt has been seen before, the system can reuse the work it already did on that shared portion, which keeps your outputs consistent and also saves you cost on the API requests. Now, OpenAI claims a developer would save about fifty percent by using prompt caching, whereas Anthropic says their cache hits can save you quite a bit more than that. So yeah, a little bit of catch-up to do, but the other things are really outshining it. The real-time stuff, that for me is the star of the show. Yeah, definitely. I think the caching thing is interesting. One thing they didn't talk about, something that Meta kind of led the charge on: they figured out that if you retrain your models with the same data, you start to get better outputs. I wonder...
if this is a way for them to retrain their model and keep the cost down. So I wonder, are they doing something like what Meta announced they did with Llama 3, and is this their way of getting to the same kind of quality? Maybe that's also why they introduced prompt caching. I think it's interesting. Yeah, a whole idea just popped in my head. You can get a pretty good positive feedback signal that a user got the answer they wanted if the cache is being utilized and they're continuously hitting it. That really helps, I'm sure, with training. You don't have to decipher that as much as you do with any of the other input connected to it. Wow, I didn't even think about that off-brand use case for it. Yeah. And then also, with the Realtime API, like you said, man, this is something we've been waiting for and excited about, because now this gives superhero-level capability to almost anybody who has the patience to read through a little documentation and understand the structure of how to build a proper application. I mean, we live in a world now where I would imagine a tinkerer could build their own, better version of Siri using this, then charge for access to it and focus on whatever area they want to focus on. So me and Sekou, behind the scenes, have been playing with the cost, how much these things cost. For example, with the Realtime API, right now the cost on the audio side is a hundred dollars per one million input tokens and two hundred dollars per one million output tokens. So we did some math right here.
So say, for example, you have an office and you're looking to get help with the phone lines. We'll scale up high: let's say you're receiving five thousand calls a month, and the average call is ten minutes. If you do the math, that's about thirteen million seven hundred fifty thousand words per month. And what does that come to? If you're using an AI phone agent to handle your phone lines, it comes out to around twenty-eight hundred dollars a month for your business to run phone lines that are completely autonomous with your AI phone agent, your AI system, whatever you want to call it, for five thousand calls a month. Which is just, wow. I mean, twenty-eight hundred dollars on the API side can seem expensive, but when you start to put it into practical applications, and I hate to use the word jobs, it's really something that's really exciting. So yeah, man, it's definitely interesting. Yeah, I think it's a positive step in the direction of AI augmentation for the workforce. Because there are things, even in a doctor's office, that are laborsome, things you kind of don't want to spend your time doing. And sometimes you want to give a much richer input rather than press this button to go here, press this button to do this, right? Mm-hmm.
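For what it's worth, that back-of-envelope estimate can be reconstructed roughly like this. The words-per-minute rate, the tokens-per-word ratio, and the even input/output split are our assumptions, not official figures; the $100 and $200 per-million-token prices are the audio prices quoted above.

```python
# Back-of-envelope estimate for an AI phone agent on the Realtime API.
# Assumptions (ours, for illustration): ~275 spoken words per minute of
# combined audio, ~4/3 tokens per word, and traffic split evenly between
# input (caller) audio and output (AI) audio.
calls_per_month = 5_000
minutes_per_call = 10
words_per_minute = 275

words = calls_per_month * minutes_per_call * words_per_minute
tokens = words * 4 / 3
input_tokens = output_tokens = tokens / 2

# Quoted pricing: $100 per 1M audio input tokens, $200 per 1M output tokens.
cost = input_tokens / 1e6 * 100 + output_tokens / 1e6 * 200

print(f"{words:,} words/month")  # 13,750,000 words/month
print(f"${cost:,.0f}/month")     # $2,750/month
```

Under these assumptions the math lands on the same figures the hosts mention: about 13.75 million words and roughly $2,750 a month.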
So when you think about that, that's huge. For one, it will increase the amount of engagement you can have at these different call centers, improve the call center flow, and make it so your customers are better prepped once they actually get a real agent. So I'm interested to see where it goes from here and what gets built from it. I mean, with the Realtime API, man, I can think of so many use cases, and the one I'm probably most excited for, and the one I'm probably going to spend too much time building myself, is the AI therapist. This is now a huge step forward, not only toward that becoming a reality, but toward being able to spend time with it, maybe late at night, and have it interact with real-life scenarios in ways that can be life-changing experiences. So now you can store personal, deep information inside a vector database and retrieve that information while your AI therapist helps you through grief, helps you through the loss of a job, helps you through marriage counseling sessions. And when you start to integrate something like a RAG application, where you can include different research, books, and stories to help tailor it to what you're looking for, this becomes really, really interesting. Yeah, it's huge. So yeah, I'm looking forward to what we end up building from these things. I've always got the work I'm doing on this, trying to create a second brain for myself to store all my thoughts and make it easy for me to recall things from the past.
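To make the vector-database idea concrete, here's a toy sketch of the retrieval step in a RAG app. The notes and their tiny three-dimensional "embeddings" are made up for illustration; a real application would get high-dimensional vectors from an embedding model and use a proper vector store instead of a Python list.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "vector database": stored session notes with made-up embeddings.
notes = [
    ("Talked about coping with grief after a loss", [0.9, 0.1, 0.0]),
    ("Discussed stress from the job search",        [0.1, 0.9, 0.1]),
    ("Notes on communication in marriage",          [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=1):
    """Return the k stored notes most similar to the query vector."""
    ranked = sorted(notes, key=lambda n: cosine(query_vec, n[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# A query embedding close to the "grief" note comes back first.
print(retrieve([0.85, 0.15, 0.05]))
```

The retrieved notes would then be stuffed into the model's context along with the user's message, which is the "augmented" part of retrieval-augmented generation.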
Well, because right now the way the input goes is: I talk to it, it records it, it sends it to OpenAI, and I get a response back in text with some interesting things inside of it. But now I have a whole other step I can add in here, which is having it actually give me real-time feedback, removing the voice recorder from the situation and making it an interactive call that happens in a private way for me. So I think that's super, super interesting. And there are plans, like the Teams plan, where you can set it to not store or train off of your data. I think that's huge. So this is something that happened on Dev Day, but I feel like it's such a huge thing that I hope more folks have seen what's happening here. One of the things we didn't talk about, which I'm really curious about: why not do this public? Why did this become a private event? Oh, yes, please, talk to me. Oh yeah, I know why. Because they would have been flooded. This thing is so popular, people would have flown across the United States, Marlon, to arrive at their door. Yes. Unless you're a friend of the board or some type of partner or something like that. And it also makes those folks feel like they're part of an exclusive club. I don't like it. Well, in a way, that's how they pick the winners and losers. The people who are over there at that exclusive conference now have a leg up on everyone else outside. I don't like it. OpenAI, if you're listening, I have been a user, a builder, since GPT-2, 2.5.
And so loyal, very loyal. Yes, very much so. I have taught, to this point, fifty-plus AI workshops, most of that teaching how to implement and integrate AI tools, from ChatGPT to the API side. And I've done panel discussions, keynotes, podcasts, all talking about this area. Damn it, I want my invite. I want my invite. They should be a sponsor. There you go. Yes, to make it up to us, because we were devastated over here. Yes, become a sponsor, because literally I got the email saying OpenAI Dev Day and I'm like, okay, they just announced it, and no, they had just done it. Yeah, I was hurt. I was hurt. This is the Silicon Valley way. Speaking of Silicon Valley, Silicon Valley is also known for its capital and its loads of money over there. So watch this: OpenAI just announced and closed the biggest venture capital round in history, raising a massive six point six billion dollars. This brings their total valuation to a whopping one hundred and fifty-seven billion. The round was led by Thrive Capital, which put in one point three billion, and big names like Microsoft, Nvidia, and SoftBank, which is also interesting, contributed significant amounts. The new funding will allow OpenAI to continue to push the limits of AI technology, expanding their compute power and hiring top talent to stay ahead of the competition. OpenAI has already spent billions training their models, with the GPT-4 model alone costing over a hundred million dollars. What's also interesting is that some investors have agreed not to back OpenAI's competitors, like Anthropic and Elon Musk's xAI, in exchange for being part of that record-breaking round. This is a huge cash infusion that will help OpenAI maintain its dominance in the AI space, where competition is heating up from companies like Google, Amazon, and Meta.
And a key note here: it was also rumored that Apple was going to be one of those investors, and Apple decided to pull out at the last minute. So yeah, there's a lot in there. There's a lot in there. And you know, it's still not anywhere close to what they need. Microsoft is looking at building nuclear power plants. And I feel like this is also, like we said, a Silicon Valley thing, right? You have all these companies investing in this technology, and when you look at the folks investing in it, they're, like, paying themselves. Microsoft is putting money into OpenAI so that OpenAI can use more of their services; those services are going to use NVIDIA, and NVIDIA is putting money in. It's a circle. It's a circle. So it definitely looks huge, but Microsoft just gave themselves a raise somehow in this process, because I'm sure they're making more money off the portion of the one billion that they put in than what you're seeing from this round. So it's a wash. Actually, it's probably just pure profit for them at this point in time. Yeah. I mean, so, you know, I used to work in venture capital. I was an engineering director at the largest minority-owned venture capital firm based out of the Midwest. So I got to see how that world works, how that world moves, and the whole process. And this is impactful. One of the interesting things is I didn't know SoftBank was going to get involved. I mean, they have the funds to do so, for sure. And maybe it's firms and institutions like SoftBank that can typically afford this as well.
But now I think we're about to see, I can guarantee you we're about to see, some of the hardware side of these things as well. Not only the compute side, but hardware devices, integrated devices that have OpenAI and different models embedded into them. And remember, too, Jony Ive announced, I think last year, that he's working on a hardware device in partnership with OpenAI, and we have no clue yet what that is. But I imagine, because hardware is typically one of the most expensive industries to build in, that this funding will give them some energy and some speed as well. And also, I think what was interesting was having the investors agree not to back OpenAI's competitors. Yeah. Very interesting. That's pretty strategic, pretty smart on their behalf to have that in there. You're definitely picking winners and losers in this process. The Apple move is actually probably the most surprising piece I've seen. Because with Apple, the fact that they're deciding not to put money into this and be part of this process is them pretty much saying they're open to different models. They're open to creating their own thing, and all the devices they have out there are not just going to be open for OpenAI's future development. I think that's a pretty huge statement on Apple's part. Yeah, for sure. It's funny, Adrian and I talked about this last week, the competitive landscape of things. It seems like a lot of the big companies are partnering instead of just focusing on building their own.
And Apple is one of those institutions where not only can they attract the top talent to build, but they have the infrastructure and the access to build a strong competitor that lives in the Apple universe. I don't see them necessarily building their own ChatGPT or anything like that, but creating a voice model that lives in the Apple universe? I can definitely see them building something like that, just because it benefits the brand so much. So I definitely agree. I think them deciding not to invest is notable, but they still have Apple Intelligence, which is powered by OpenAI. And I think this is kind of the Apple way as well, because typically, in the early stage of something, Apple partners with companies and then eventually goes and builds their own. That's typically how they work. So I wasn't necessarily surprised that they partnered with OpenAI to power Apple Intelligence and Siri 2.0, or whatever you want to call it. But I also think this is an indication that they're going to go back to doing it the Apple way and start to build their own voice models, or whatever it may be. Yeah, you know, this reminds me of, and I'll go back to my own history treasure trove here, the era when the cloud providers were just coming online. You had AWS, you had GCP, you had IBM, and, was it Samsung that had their own cloud at some point? And everyone was like, you know what, I'm a company building things on the cloud, I'm going to make it multi-cloud. I want my service to work on all the different clouds out there and treat them all the same.
And what I think we've all learned through that process is that there's a cost to not doing deep integration, and that cost usually outweighs what you would lose by not working on all platforms. So it is kind of a gambit, I think, that Apple is playing. But the thing is, for Apple it's hard to look at any company as an equal partner when you're Apple. It's really hard to say, we're going to give each other equal leverage here. I don't think it knows how to operate that way, personally. It's really its own thing, and everything else is kind of bolted onto it. But equal partners in some way? Nah, it won't do that. Right now, Nvidia and OpenAI are pretty much equal partners in some regards, because the amount of buzz that OpenAI creates makes people more enthusiastic about using Nvidia's product. Right. Yeah. And I'm going to match you on the history part. Back in the Apple II days, Apple was really great at the hardware, and they became really bad on the software side as they went along. This is also why they acquired NeXT, which was Steve Jobs' startup after he departed from Apple. So they acquired NeXT, and then they started getting better at the software side. Kind of the same thing with the iPhone: a lot of the hardware at one point was built by other companies, typically outside the U.S., while they ran the software side, and eventually they brought that in-house as well. A lot of the accessories for Apple were built externally and then eventually brought internal, too. And remember at one point the web browser required Flash? Oh yeah. Yeah.
And I mean, Steve Jobs was pissed about it. I remember watching them go through that process: they supported Flash for maybe six to eight months, then they built their own version of things, and probably a year after that the web browser didn't need Flash anymore. Apple just has this continuous process: they partner when they feel like it's necessary, but their long-term roadmap is to eventually build their own and bring it back in-house. I don't see it being any different from here on out. But at this point, right now, as of today, it's extremely difficult to compete with what OpenAI has done and built. So there may be more benefit in partnering, even for the long term. But again, it's Apple. They can figure it out. Yeah. Oh man, that just made me think about removing Flash. I remember when Flash was declared end-of-life, it took like ten years to remove Flash off of systems. It was such a sticky piece of technology. It was just ingrained into most of the web. It was so crazy. Yeah, that's a throwback for sure. You would literally be scrolling down a browser page and you'd have this big gray square with a red icon, the Flash logo, and you had to click that before it loaded whatever multimedia you were trying to view. And Apple was like, yeah, enough of that, we're not doing that anymore. Actually, hold on. When did Apple create Safari? Oh, my God. I think it was around that time, for that reason. Huh. Maybe. Maybe. I'm going to do some more research on that, and we'll talk about it next week.
But I'm almost sure one of the reasons they created Safari is because Steve Jobs hated the Flash experience so much. Yeah. I'm almost sure. So, just for real context: I have a friend who worked at Apple while Steve Jobs was there, out in California. So he has all these stories about how Steve's mind worked. At their demo days, with all the products, Steve hated that the exit signs in the back of the auditorium would stay lit when they turned off all the lights. And he was like, no, I want a full blackout experience. And they said, well, we can't turn them off, by law, or we'll get in trouble. So he said, okay, what's the definition of trouble? They said, we'll get fined. And the fire chief heard about it and came in and started fining Steve Jobs, started fining Apple, for turning off the exit signs, because he wanted a full blackout. Steve started sending in the fines ahead of time. He started sending the fines ahead. He just didn't care. And they were trying to make it a big thing and figure out a way to stop him from doing it, and he was like, yeah, that's not going to happen, we want a full blackout. So just off that one thing, knowing how his mind works, the man is a menace, I'm almost confident one of the reasons they built Safari was the Flash experience and how much he hated it, at a time when you needed it. I'm going to say they built Safari for that reason. I'm going to do some more research. Oh yeah, you have to get to the bottom of this one. Yeah. All right. So, moving forward, we've got something here that looks concerning.
Two Harvard students have shown just how easy it is to use facial recognition technology with smart glasses to identify strangers in real time. They built a demo using the Ray-Ban Meta smart glasses that can take people's faces, recognize them, and pull up their names, addresses, and phone numbers from public databases. They said this whole process can happen almost instantly, using technology that is all widely available. The students say they didn't build this for misuse, but to raise awareness about how easily this tech can invade privacy. The project demonstrates that the future where we can be recognized by strangers on the street is already here. Yeah. So, these Meta Ray-Bans. When they first came out, I was like, oh, these are kind of cool. I can use a little voice assistant. It has a weird camera on it. I don't think I ever used that, but I liked the idea of listening to music and having a voice assistant on there. I feel like we got exactly where I expected us to get. The fact that you put cameras where folks' eyes are, there are going to be folks doing creepy things with the glasses. I'm sure you remember the Google Glass. We always refer back to this thing. I was going to bring that up. What did they call them? The glassholes. Yeah. It's the second coming. So I think you're going to see more places where they're going to say, hey, you can't bring those into the movies, you can't bring these into public spaces. Hopefully Meta gets in front of this before they have to sit in front of Congress. Again. Again. And then, of course, not long after this article broke, within two hours, you learn that Meta confirmed that it trains its AI on any image you ask the Meta Ray-Bans to analyze. So say you're taking a picture of something. Your daughter. Your daughter. And you know what they do? Once they analyze it, they said they're going to use it in ads.
They're going to put your face or your daughter in an ad to get back to you. This is one hundred percent creepo tech. Like, stop it, stop it. You're making everyone look bad. Yeah. So, yeah. I think that seems like the students' true intention, to bring awareness to this. When I read this, my mind started going and I was like, oh yeah, I know how to build that. And it's funny, I didn't necessarily think about that use case, but going to what you just said, Sekou, this is also one of the reasons for the big pullback from the Google Glass: they were concerned with privacy. And so I think two things here. I believe we live in a world that only cares about privacy when it affects people directly. As much as people scream about it at the top of their lungs, I just don't think people typically care about privacy that much. I mean, we still live in a country where abc123 is the most used password. That's not caring about privacy or trying to protect yourself; you're just looking for ease of use, and people have built companies and industries off of ease of use. So I think it comes back to: it's a good job for awareness. Typically, who we are as a country, we always build the car before we build the seatbelt. And so I would hope Meta gets in front of this. I mean, they've had enough bad publicity over things they've done and things they've manipulated, across all the different sectors of their work. But this is definitely that. I mean, when I saw it and read it, and I knew for sure that I could build it myself, then I knew tons of other people could figure it out.
And so this can definitely be a problem, but I definitely agree, I think we probably will start to see it. And unfortunately, I think it won't become a widespread issue until something unfortunate happens that connects back to somebody finding somebody and doing something. Number one, I take my hat off to the two students over at Harvard for bringing awareness to this. Just like any other advance in new technology, people are going to use it for good and people are going to use it for bad. Typically the technology itself isn't the evil part; it's always human intention. People can take this and do unfortunate things with their human intention. I think it's something we definitely have to be mindful of. I mean, the question is, what happens now? Yeah. It's so interesting, because it is very similar to the technology of the Google Glass. The only difference is that Google Glass did not look cool. That's the difference between these two things. So Facebook successfully made something look good. Google Glass didn't look seamless. It definitely looked like you had a little robot thing on your head. It was obvious. I also think this is one of the issues too: the Google Glass was obvious. Am I making this up? Did it have an indicator that it was recording, too? It had something like that. It had some type of indicator. It would light up. But did it have a camera? Yeah, it did have a camera on it. Yeah. So I don't know. Facebook is a company that really got everyone to shed their privacy and allowed them to commoditize that. I'm not surprised that they are making a very successful push at making us lose more of our privacy and give it to them for free.
But like I say, this time it's not for free, because we're buying the glasses. Paying for them to take our privacy is a little different from getting a free service and paying for it through ads. Yeah, it's very similar to what some people do even with ChatGPT. I've found it shocking that I've had to tell people not to put their Social Security number, phone number, or birthday into these models. So, again, going back to that, I just don't believe the vast majority cares about privacy. They only care about privacy when it happens to them. And like I said, we're doing the same thing with ChatGPT. We're paying to give away our data to some degree. And even with this, I've been an advocate of: use the tool, but also understand when it's time to actually do things yourself, when you're potentially tapping into your own genius, because it's going to learn and it's going to get better. There are pros and cons with that. And I think that's one of the reasons I was excited to see the GPT Store, because then you can allocate your brilliance into a particular framework that you can monetize. But as of today, there is no official monetization strategy for GPTs that live in the GPT Store, though there are ways to monetize with it. So, yeah, I think it's interesting to see what's going to happen with that. Yeah. Awesome. All right. So also, man, Sekou, you and I have definitely spent our years in the workforce, you longer than me; you have a bunch more knowledge and wisdom about the workforce than I do.
But the future of AI in the workforce is definitely going to be interesting. Harvard did an article here. The saying going around is: AI won't replace managers, but managers who use AI will replace those who don't. While that might be true, we're not quite there yet. A recent survey found that only fifteen percent of leaders are consistently using AI in their daily work. However, the next generation of managers is already embracing generative AI. Many MBA students are using AI tools every day, not just for simple tasks like writing or summarizing, but also for brainstorming and problem solving. As these students enter the workforce, businesses will need to adapt to keep up with their tech-savvy ways. Companies that want to attract and keep this talent will need to demonstrate their commitment to AI, whether that's giving new hires access to the latest tools or providing hands-on projects where they can experiment with AI. Sekou, man, thoughts? Yeah, I think this is interesting. I definitely see, maybe in the first generation of AI that we're talking about, maybe the pre-AGI world, let's say it that way, I can imagine that having AI skills, being able to augment your work and produce things at the speed of someone who's using AI, is going to be beneficial to have. And I think as the AIs get better and better, I see it not just being a requirement; I imagine there will be portions where we get to the point of really outsourcing. We're going to outsource a function of management to AI. Right. Whether it's going through performance reviews for individuals or looking at success metrics for projects, I definitely see a place where those things intertwine. Today, I don't think I've seen many postings that say, like I said, "requires AI skills" or "experience with Copilot." That is what I'm kind of looking for. That's my indicator that this is really real.
Right now, it feels like we have to see more indicators to know whether we've gotten to that first step yet. But it's changing so much. If you teach someone how to use GenAI in their workflow, beyond just ChatGPT, next year is going to look completely different from this year. Last year looks different from where we are right now. Right. Right. It's moving too fast. I expect it to actually get integrated into, like, a Master of Business coursework, right? Maybe just general AI concepts, but how to effectively use it. I don't know one organization that has a truly effective dev experience out there for using AI. They're building it. It's getting better. But no one is one hundred percent relying on it. So it's hard: how do you teach something that doesn't exist yet? It has to exist first, then you teach it. This actually reminds me of when I started doing systems administration. Systems administration was just something you learned by being there and absorbing information about what was going on at different businesses and operating systems. It wasn't until years and years after that they could actually distill what it was into something that could fit into coursework. So I think there's going to be a lag. We need to go further before we can teach what's behind us. So I don't expect that to happen super soon, but it's going to come. It's going to happen. But I did question a little bit the merits of this particular article. Yeah, I think the future of the AI workforce is all based on the speed of education.
You have to teach, and you have to communicate to leadership first, for them to see what the ROI is going to be. And once they can accept that ROI and understand what it's going to be, they make the investment, and then comes the implementation of teaching, via workshops, webinars, conferences, whatever it may be. I think that process starts there, but that process, at scale, takes a massive amount of time. Because you're also going to have organizations and institutions that prefer to use Microsoft Copilot over ChatGPT, or OpenAI's tools. And another institution is going to want to use Anthropic over something else. And then you're going to have institutions that build their own versions of it. But they still need somebody to come in and teach, too. So that's definitely going to take some time and be a process. When it comes to the workforce side of things, I definitely think companies like OpenAI and Microsoft are kind of leading the way. I think Microsoft Copilot is headed in the right direction. My concern with how they're building and moving is the speed of the products and integrations they're building. Like, we just got an email. What was it? We just got an email that Microsoft Copilot can now see your screen and respond back in real time. Which sounds awesome, but what would you do with that? You know what that reminds me of? It reminds me of Android building all these features that nobody uses in three months.
They're building all these different things where your phone can do this and that and zoom all the way into the moon. What's the practical, everyday use of that? You're not using those types of features, and you're spending an ungodly amount of time on research and product development and all these different things. So it's just like: I understand why you built it, and I understand the speed at which you're going, but you're also skipping over the main part of this whole universal need, which is education. You're going to have to educate the masses. I will be extremely interested to see, I mean, this right here just said that only fifteen percent of leaders are consistently using AI in their daily work. So what does the entire workforce look like? Do you think we're at thirty percent, forty percent integration inside daily work? I don't think so, as of right now. Because everything is still spread out. Being able to just create a PowerPoint using AI, there are seventy-five different ways to do that. And then you start to integrate things like images using Canva. Me personally, I just haven't had a good experience with the image generation tools that Canva is promoting, the ones they say you can use seamlessly. It's building. It's still figuring itself out. Hallucination is still very much at play with all these tools and processes. I think we eventually will live in an AI-focused workforce. I think the spend and the money behind education has to be the main vehicle that takes us there. And as of right now, I just don't see that happening.
I just see everybody figuring it out on their own, learning on their own right now, which is fine. But when it comes back to that being a reflection of the ROI, which can cost you dollars, speed, output, efficiency, I don't see organizations and institutions spending the time right now. Yeah. I think that's probably more the reality. Oh yeah. I think we have a comment here. Yeah. AGI between twenty twenty-seven and twenty twenty-nine. Yeah, that's definitely what they say. But I think we're going to be in this interim state for a little bit. Even once we get to AGI, there's still a bit of lag time. Businesses adopt technology at such different rates. If you're in the tech business, we're pretty fast. Enterprises are okay; they're not lightning speed either. Then you get to the companies that use tech to support their business but aren't tech-first. They're actually slower, right? Then you get to healthcare, which, when you're dealing with lives and technology, is really, really slow. And then government. Government is the slowest-moving thing out there, and that's like fifteen percent of the workforce right there. So there's still a good amount of this interim time that we'll be talking about here. But I think what's happening is that right now is the golden era of early adopters.
The early adopters that are on this journey right now are going to be the folks that are just silently winning. They're not vocal about how much faster they're being made by using these different tools; they just look like they suddenly became an amazing engineer behind the scenes. I think that's where we're at right now, and we're going to be there for a little bit, because it just takes time for some of these things to be digested by industry. But education, actually, I don't know if you'd think education would be at the forefront of all these different things. They are actually right between the government and medical, oddly enough, as far as speed to adopt things. So unless you're going to a top-tier school, maybe MIT or something like that, you have a better chance; but the rest of the places, no. Yeah. I spoke at the National Black MBA Association conference last week, week before last, and it was interesting to kind of scan the room. Most of these are individuals in the workforce; there aren't entrepreneurs in the audience. These people are in the workforce, from different sectors, from tech to government. So it was very interesting to engage and see basically how they're thinking about, integrating, and using AI. And the main thing I got from it is they still don't know. They're doing things, they're figuring it out, they're working on it. They're using tools like ChatGPT, but you can very much tell everybody's still figuring out the workforce aspect. And it goes back to the lack of education. I wouldn't necessarily point a finger, but it's the lack of education, broadly.
And so, going back to what you said, Sekou, education is going to have to advance in this area. You're going to have level-one education, where you just come in and teach how to prompt, how to use it, how to email. Then you may partner with the engineering department to build RAG applications, where what you do is look for documentation, emails, and sales reports, things like that. But then, for proper efficiency, you're going to have to have education per department, and you get a little deeper in there. The conversation you're going to have with your legal team is going to be extremely different from the one you're going to have with your sales team: how do you maximize and build efficiency there? The tool, the technology itself, is capable of doing so. But how do you do that? Do you just wait until you get a champion internally to come in and teach the entire team and show everybody how to do it? I mean, you could be waiting for a long time. Or do you do some type of hiring, partnering, consultation, and hire someone like Marlon Avery to come in and teach your organization how to do these things? Shameless plug. And so, yeah, it's going to be interesting. I've done a lot of these workshops with organizations, and I'm happy you said that, too, because when I think about it, I haven't done one. In the last sixteen months, I've done sixty-six engagements, everything from workshops to conferences and discussions. And when I think about it, I haven't had a single organization that wanted me to teach a particular team. It's always been from a bird's-eye view.
And then typically from there, the education doesn't move forward. The application side of things sometimes moves forward, in how they want to build and integrate. But the education aspect doesn't become focused. It's still kind of wide. And when I say focused, I mean focused on a particular team, like your sales team or your marketing team. So it's interesting. Yeah, definitely. Yep. I'm very excited to see what the integration story is going to end up looking like for some of this stuff. The education side, I'm hoping. Maybe this time it won't be the same, but just going from history and what we've seen, these are the trends that usually happen. We always build the car before we build the seatbelt. Exactly. So yeah, we'll see what's going to happen. For sure. Sekou, man, give us an accent, give us something to close out here. We're going to ask the advanced voice to do a podcast close-out. Sekou, man, what should we do here? Hmm. You know what? Last week, to close out, we did a Jamaican Patois accent. Which one do we do this week? How about we mix a little bit of a Scottish accent with a light stutter? How about that? Let's do it. That is awesome. Okay. All right. Here we go. So I'm going to gear up my ChatGPT application. I'm going to gear up Advanced Voice. You know, it's interesting too, and we kind of talked about this earlier: has there been any conversation on how Advanced Voice works better per model? You know what? I've tried it with o1-preview, and I don't think it's working the way I expect.
If I type it in, it definitely goes through its tree of knowledge, its process of thinking about each component of what I've asked, and then gives me answers. Yeah. But it answers me back way too fast when I'm using voice. So I'm like, it's clearly doing something different here. So we're going to stick with GPT-4 Omni. All right. So here we go. You said Scottish accent with a stutter, correct? With a stutter. Yeah. Okay, cool. All right, here we go. Hey, how you doing? Can you create a podcast outro? The podcast is AI With Friends. This is episode four. And today we talked about OpenAI's DevDay with the new real-time API. We also talked about OpenAI's record-breaking fundraising round. We talked about AI and facial recognition, is it a scary reality, and then we also talked about the future of AI in the workforce. The hosts of the podcast are Marlon, Sekou, and Adrian, and I want you to give the outro in a Scottish accent with a slight stutter. ...to this episode of AI in the workforce. Your hosts, Marlon, Sekou, and Adrian are grateful for your time. Stay tuned for more enlightening conversations. And don't forget to subscribe. Cheers. Awesome. Okay, so I want you to redo the outro, but increase your stuttering as you're talking. Thanks for tuning in to this episode of AI With Friends. We've had a fantastic chat about OpenAI's Dev Day fundraiser, AI's role in facial recognition, and the future of AI in the workforce. Your hosts, Marlon, Sekou, and Adrian are grateful for your time. Stay tuned for more enlightening conversations, and don't forget to subscribe. Awesome. All right. Lastly, give me a mix between a British accent and a Jamaican Patois accent, with no stutter, for the outro. This is gonna be interesting. Thanks for tuning in to AI With Friends.
We had a brilliant discussion about OpenAI's record-breaking Dev Day fundraiser, AI's role in facial recognition, and the future of AI in the workforce. Your hosts, Marlon, Sekou, and Adrian appreciate your time. Stay tuned for more insightful conversations. And don't forget, subscribe. Bless up. Bless up. Bless up. Oh, man. That's my thing. We're going to start getting some suggestions from the audience, letting us know. That's awesome. All right. Well, it's definitely been a great one, man. You can find me on all platforms, I am Marlon Avery. So feel free, man, to engage, send me a message, and ask me any questions. Sekou, man, where can people find you? Hey, you can find me on Twitch and on TikTok at SekouTheWiseOne. And it's the number one. It's the number one. The number one. And also, if you're listening to this on our podcast, you can find it in the show notes too. So, man, definitely, guys, we appreciate you. Definitely subscribe, comment, like, and let us know what you think. Until next time, we are out. All right. Peace.