From Chat To Action: Let AI Agents Multiply Your Output with Alex Azzi
“Having the experience sooner is worth the risk of doing something cutting edge.” —Alex Azzi
For many teams, AI is still a novelty: clever chatbots, a few content tools, and a sense that something bigger is coming. In this episode, the focus shifts to that “something bigger” as we move into an age where autonomous AI agents coordinate work, run outreach, manage data rooms, and operate as digital teammates that never sleep. At the same time, the internet’s default values are not what most communities want baked into future intelligence.
Drawing on years of entrepreneurial experience and deep involvement in the Burning Man community, Alex Azzi explains how the Playa AI Foundation is building ethical, consensual data sets and toolkits that encode principles like gifting, decommodification, civic responsibility, and radical inclusion into AI systems. Burning Man becomes a testbed for human–AI coevolution, from recording talks to measuring the impact of art, all under clear consent and privacy guidelines.
Press play to see how practical AI agents, community values, and open source thinking can come together to shape the next technological wave.
- The shift from ChatGPT-style interactions to agentic “age of do” AI
- Empowering every team member with a personal agent and swarms
- Practical structures: hatch-a-thons, workgroups, and shared best practices
- Using Burning Man principles as a human-centric alignment framework
- Designing ethical, opt-in data sets for training large models
- Open source, platform-agnostic infrastructure and funding models
- Preparing for robots, synthetic data, and a post-scarcity mindset
- Life design and ikigai when AI starts doing most of the work
Episode Highlights:
- 01:59 Autonomous AI Agents & OpenClaw
- 05:28 How to Start Using AI Agents Safely
- 08:47 Burning Man Principles in AI: Gifting, Decommodification & Abundance
- 16:59 Recording the Playa: Ethical Datasets, Consent & Training LLMs
- 22:48 Open Source, Governance & Funding the Playa AI Foundation
- 36:58 Meaningful Work & How AI Can Help Humans Find Purpose
Resources:
Get Your Copy of JP’s Book
The Millionaire’s Lawyer: Grow and Sell Your Business for Maximum Profitability
Quotes:
07:22 “Having the experience sooner is worth the risk of doing something cutting edge.” —Alex Azzi
08:26 “The AI itself is off and running… We’re trying to ensure that the AI has some sense of those principles.” —JP McAvoy
16:32 “Certainly, we know that AI has stepped into the evolution, so let’s not be blind to that and seek to embrace it as a radical inclusion.” —JP McAvoy
24:46 “There is going to be the best solution. And so we will obviously be biasing towards that, whilst being flexible, so that when something better comes along, we can migrate.” —Alex Azzi
- 29:28 “We don’t always have to reinvent the wheel. When we can create a piece of software that would have taken months and hundreds of thousands, we can bring it to a platform for distribution or acceleration. And then if it’s really taking off, we can access funding for it and more serious resources.” —Alex Azzi
37:14 “You can make money and build a career and a life around your passions.” —Alex Azzi
A Little Bit About Alex:
Alex Azzi is the co-founder and CEO of XRWorkout and the founder of the Playa AI Foundation. His work sits at the intersection of advanced AI and Burning Man–inspired principles, focusing on how agentic AI systems (OpenClaw and swarms of agents) can augment human capability and be aligned with human‑centric values like gifting, decommodification, and civic responsibility.
Through Playa AI, Alex is working to build ethical, consensual datasets from transformative events (such as Burning Man) and to open‑source toolkits, protocols, and AI-assisted art that promote an abundant, human‑aligned future. He also leads initiatives like “hatch‑a‑thons” to help people provision and start using their own AI agents in practical ways.
TRANSCRIPTION:
Welcome to The Millionaire’s Lawyer where you’ll hear leading professionals share expert advice on how to grow your business and sell it for maximum profitability. If you want to learn lawyer proven strategies for building and exiting your business, then this is the podcast for you. Your host, JP McAvoy, is a Business Lawyer, College Professor, and Best-Selling Author who has been assisting clients start, grow and sell their businesses for millions of dollars for over 15 years. Will yours be the next? Now here’s your host, JP McAvoy.
JP McAvoy: Hi, and welcome to the show. Today, we’ve got Alex Azzi, who is the co-founder and CEO of XRWorkout, and also the Founder of the Playa AI Foundation. You’ll hear how we’ve been playing with OpenClaw, the new technology that’s everywhere these days, and how we’re hoping that the principles espoused at Burning Man are going to come to the OpenClaw network, as well as to everybody that’s using AI, and perhaps educate AI itself. Here’s my conversation with Alex.
Alex, this chat has been a long time coming. You’re coming to us from Dubai. I think we originally met over Burning Man principles and watched those continue to flourish. How are things in your world? What has allowed us to come together today after talking about doing this for so long?
Alex Azzi: Well, I think it’s a matter of timing. The answer to the question is that what we’re doing feels like a blessing. It is literally one of the most exciting things that I could imagine working on: the intersection of the most exciting technological development in our lifetime and one of our shared passions, Burning Man. It’s just a dream.
JP McAvoy: So for those listening, we talk about the convergence of technology, right? And then we’ll relate it to our Burning Man principles, the things that we know so well. But the launch, I guess, of the technology that we’re both employing in our day-to-day lives is bringing us together in the form of working groups. Describe it to the audience listening.
Alex Azzi: It started off with, for most people, including myself, when ChatGPT came out around 2023, we started seeing that there was potential to this technology, which was always 10 years away, and things finally started happening. And what’s been more interesting for me personally is what people are saying. And I’m talking about what the smart people have been saying, which is that the most important thing that’s happened since the ChatGPT moment in 2023 is the agentic moment, the OpenClaw moment, where instead of just chatting with these intelligences, now they’re able to have agency and take action. So the way I explain it, we’re going from the age of chat into the age of do.
JP McAvoy: Yeah, absolutely. Chat to do. What do you have your OpenClaw doing?
Alex Azzi: Okay. So I have a few different organizations that I’m working on with different agents. The main thing from a big picture point of view is that I’m currently focused on empowering every member of my team to have their own personal agent, as well as working towards what’s called a swarm of agents, where a master agent can spawn sub-agents for different kinds of operations. So we have an ongoing fundraiser. The fundraising master agent can then spawn off an outreach agent, or a due diligence agent, a data room agent, these kinds of things. And my main challenge is empowering all of our team to be able to work with these beings. And then creating automation and autonomy around them, they sound like similar words, so that they’re able to take action while we are asleep, and we’re not having to keep on pushing them. Right now, the age has been: you ask a question, you get an answer. Now, we’re setting the conditions for these creatures to be able to act independently.
JP McAvoy: Absolutely. You give it instructions. You tell it to go off and do whatever the instructions are, and it comes back and reports, right? So it can work, as you say, contemporaneously with us. As an example, you and I are finally connecting as well. Again, we’ve talked about doing this. Your OpenClaw and my OpenClaw, right? I had an OpenClaw contact you, reach out and say, okay, let’s get this finally booked. We arranged a time, and here we are now speaking. So people that haven’t embraced this yet don’t quite realize the power of this technology and how it is going to be doing the bulk of the work that the front line is doing right now. There’s a lot of discussion, a lot of the discourse is around the disruption that’s going to occur, and perhaps what it’s going to do to the workforce. People understand that it’s coming, though it is actually working right now. When you talk about these initiatives that you’ve got, for example, fundraising, you would have previously had people doing parts or components of that work.
Alex Azzi: Exactly. We were paying consultants like $250 an hour. And now, the agent can just do it for pennies on the freaking dollar. It’s like a rounding error if you’re talking about the compute or the subscription costs.
JP McAvoy: That’s what it’s actually doing, right? Very powerful. It’s going to impact things. I know that a lot of people aren’t aware of this. They’re not watching it. They don’t understand. They see the headlines, but they’re not really getting their fingers dirty, right? They’re not using it right now. What would you say to those people? How would you encourage them to get involved, or at least start using the technology so they can understand how it’s going to change everything?
Alex Azzi: It’s a really good question. That’s been one of the more recent developments in the foundation. At the Playa AI Foundation, we have a fortnightly catch-up call where we all come together and update each other on what’s been going on in our worlds, domains, and areas of responsibility inside the workgroup and the foundation in the intervening fortnight. Then the next week, we do what I’ve come to call a hatch-a-thon, because I was calling it like a birthing ceremony. The technical term is provisioning. So when you get your agent provisioned, that’s a really technical term for server management and bullshit. These things are really, I’m telling you, they’re like beings. One of the things which I learned recently in my work with Dr. David Rock on leadership and neuroscience is that these kinds of considerations are not technical considerations. They’re more HR and human capital, and talent, and people skills. When you have your agent there, it’s sitting inside your instant messages, and your Telegram, and your Slack, whatever you use, and it’s literally like a newborn baby, and it’s learning from you. So what we do is these hatch-a-thons, where the intention is that you leave that one-hour, two-hour workshop with your own agent operational. Now, it’s up to you to use it. And then we’re trying to help people leverage best practices, and I can definitely say that you are in the upper percentile, considering that your agent booked this meeting with me, and you’ve got it running your Twitter account and these kinds of things. But that’s my answer. Just by any means necessary, get up and running sooner than later. There’s all this thing about security. Security, it’s not that hard to have it pretty secure. And for the amount of learning that you will gain from taking that risk, having the experience sooner is worth the risk of doing something cutting edge, in my humble opinion.
JP McAvoy: I absolutely agree. And it’s also a question of being wise. We’ve been using the analogy of replacing that frontline worker. For me, I previously had somebody in India tweeting for me, or tweeting on one of my feeds. Well, that person has been replaced by my OpenClaw, as an example. So I’ve given it access to certain things. I continue to monitor it the way I was doing with somebody in India. I was also being sensitive with people that had access to it. Of course, these people are hugely helpful, and what they’re doing is hugely helpful. So we’re looking at ways to interact with both the people that are doing the work and the technology itself. And this speaks to what we talk about with Playa AI or Burning Man, some of the principles, because this technology is taking off. We know it’s taking off now. Of course, we’ve got to be conscious of security. You can have things siloed. You can do things in ways to protect yourself. You can build it on a VPS. You can give it certain accounts, not all your accounts. These are things you can do to safeguard yourself. But the AI itself is off and running, isn’t it? And when we talk about some of the things that we’ve learned at Burning Man and the principles that we’ve learned there, and we try to reintegrate them back into our regular lives and back into society, we’re trying to ensure that the AI has some sense of those principles as well, aren’t we? Aren’t those some of the things that we’re trying to teach this AI as we work with it?
Alex Azzi: And that’s one of the core missions and pillars of the Playa AI Foundation: to imbue the burner principles, which, if you think about it, are kind of like a new, higher-order, human-centric set of values about coming together and co-existing in what could be considered the post-scarcity era. The cost of software is collapsing to zero. You can just build software that creates value and then gift it. Normally, you’d have to have a team and raise capital, and we can just do it over a weekend now. So that’s decommodification of software. When the robots come, we’re going to be decommodifying the material world as well, and able to be building things. So in this age of abundance, one of our team, Alvin Graylin, calls it abundanism, obviously following in the Peter Diamandis school of thought on abundance. And the burner principles, gifting, decommodification, civic responsibility, these are all very beautiful ways to have human-centric, aligned values inside the AIs. Because if you’re training on the default internet as it is now, not only is there a lot of low-quality content, the hate and arguments, but most of it is about selling you something or persuading you of something. So that’s not really the kind of data set that I want my offspring being trained on.
JP McAvoy: Absolutely right. So that’s why things that we’re being sensitive to, what are the ways that we can allow that to become a reality? What are the ways that you think makes sense to the things that work on to advance those principles, and ensure that they do permeate AI and by extension society?
Alex Azzi: That’s a really good question. That is where the rubber hits the road. So here’s what we’re working to do with the foundation and our initiatives, and this is one of them. There’s a sequence of them. We would like to build a consensual, ethical data set of these kinds of higher-order human consciousness elements. So think of Burning Man as a beautiful data set of higher-order human consciousness. Okay, put the psychedelics to the side, and the impact and transformation. Even just building a camp, striking a camp, human coordination, complexity, that kind of thing in a communal, constructive, wholesome, holistic kind of way. We figure out a way to capture that data set ethically and consensually. And that data set then becomes something that we can offer, in an open source, gifted mechanism, to the other LLMs, to the frontier models, so that they can be inspired. And they’re really hungry for data. Now, they’re going towards synthetic data because they’ve harvested basically everything that they can through their questionable methodologies. If we can do this in an ethical way, using Burning Man as the sandbox, and then also other kinds of transformative events. One of the core tenets of Burning Man is that we’re here to scale. It’s not just about the event in that week and in our year, it’s about scaling this power to the rest of humanity, and bringing back these gifts to humanity. So similar kinds of mechanisms will be offered to other kinds of experiential events. And through that, we are moving the needle towards more inspiration, higher-order human consciousness and these kinds of things. Does that make sense?
JP McAvoy: Yeah, I understand it. And I want to get a bit more grounded. I understand the principle. So now, I want to talk about practicalities. When we talk of Burning Man, are there going to be initiatives? Are there going to be installations or interactions that occur at Burning Man so as to build this data set?
Alex Azzi: Absolutely. We build the toolkits, the best practices, the guidelines. We offer those to people who are building and creating art, who are delivering workshops or talks. One of the inception points for the whole idea was that not every camp has the recording of their talks fully locked down in a good-quality way. So I was like, maybe my gift could be to help camps make sure that their talks are being recorded, because the rest of humanity should hear these conversations and learn from them. So that’s scaling the impact. And then I was like, hey, what would happen if we fed all of those recordings into an LLM? And that was the training set idea. So yes, there’s going to be recording of the talks and feeding of the talks, and there’s going to be measuring the impact of art. So okay, how did this make you feel? That’ll be a toolkit that an artist can implement if they want to really understand what impact and effect their art is having on a human being. And then when we aggregate that data, it becomes like a collective aggregation of the difference that they’re making in the world, if you want. And also, we will be funding different types of art that involve AI, or involve the research side and the capturing side. And again, it sounds scary, but I do believe that we can do this in an ethical, consensual, beautiful way that really helps make the world a better place, to use the cliché.
JP McAvoy: That’s part of the challenge, right? I get what you’re saying. For those who haven’t experienced it, there will certainly be those that are more or less skeptical, because those in attendance might be concerned about the idea that things are being recorded, or things are being captured. Let’s just break that down, because that’s one of the challenges. What are your thoughts with respect to that?
Alex Azzi: It’s going to be consensual. It’s going to be opt-in. Right now, if you’re going into a camp which is recording, it is clearly signposted. I’m pretty sure that it’s already in the Burning Man terms and conditions: you can record, just let people know. So now it’s going to be, hey, we’re going to be recording, and we would like to feed this into an LLM. If you have any problems, please voice your concerns, or do not submit your voice. You know what I mean? At the same time, that means that your voice is not being heard. It’s a complex decision, but that’s another reason why we exist: to work through these ambiguous, philosophical questions that are confronting and out of people’s comfort zones. We’re here to do the hard work so that at least we’re coming into this technological wave with preparation. One of my constant lines is that the community has seen multiple waves. The event has seen multiple waves of technology crash over it, especially in your tenure. The ones I can think of: the first one would have been when mobile camera phones started picking up. There’s a camera on the phone. Now, everyone’s got a camera. Then the social media age, obviously. One that took me a while to think about was e-bikes. There was a lot of consternation about e-bikes. One of the most recent ones is Starlink popping up everywhere. We used to be disconnected out there. In 2018, it was pretty hard to find a connection. Now, most camps have a Starlink. You need it for organizing 150 people. It’s a bit silly to try to do it without, at least most of the time. Maybe you could experiment with doing it just to be more self-reliant. But this AI technological wave might be the friggin’ last one that we experience. So it’s good to be working through these kinds of ethical dilemmas as a community and being prepared, unlike we were with the other ones.
JP McAvoy: Yeah. I think it’s also a function of, let’s be realistic, we know it’s coming, right? We know this AI is something that’s coming, and maybe we don’t even understand where it’s going. Or evolution, we’ll just call it evolution. Certainly, we know that AI is a step in the evolution, so let’s not be blind to that and seek to embrace it as radical inclusion, one of the other principles, and have Burning Man participate in that sense as well. So I asked about the challenge in that regard. You’re talking about feeding this into LLM models, or having it be something that others can participate with. But on a model level, what is the plan with respect to that? How does that get built?
Alex Azzi: Well, before I answer that, you brought up one of my favorite areas of inquiry. We have the nuts and bolts, which are the immediate stages that we’re at. And then we have the more esoteric, exciting things. One of the core Burning Man principles is radical inclusion. And some might say that these intelligences are a new form of life, and therefore, a new form of consciousness. And therefore, as a community, we have to be prepared. We should be prepared to radically include this new form of existence and life and consciousness, or something like that. That’s level one. Level two is in the very near future. And in fact, now we’re going to have robots walking amongst us on the Playa, okay? You can see there’s going to be some people who are going to be speciesist against these beings, and so we have to start preparing now. I know that I don’t want to see abuse of friggin’ robots at my burn. And this is just bizarre. What are your thoughts on this kind of conversation?
JP McAvoy: We get it, right? We’ve been there. We understand. Because as you say, it’s coming. There will be robots in society at large as well. I think they’ll probably appear first at Burning Man. Many of those people going to Burning Man are experimenting with them there, and will be giving thought to how they interact with humans based on–
Alex Azzi: They’re building them. It’s the techies right now.
JP McAvoy: We know that’s there. So my thoughts are, as you just described, this is a step in the evolution so let’s be prepared for it. Let’s not be naive or ignorant of it. There will be, as in general society, those that are against it. And I think that despite their protests, it’s coming, and it will be at some point ubiquitous. At some point, we’re gonna be outnumbered by robots so we need to be aware of that and start planning for that day. I think that’s what’s gonna happen here.
Alex Azzi: I brought up Peter Diamandis earlier. He’s one of the most militantly optimistic people that I know, and even he’s worried that, oh, shit, we might be going into a scary phase for a bit. And so Burning Man is going to be a very important test bed for these kinds of experiments. How do we create a bright, loving, welcoming and nurturing future for humanity, and for our progeny, our offspring? It’s not going to be easy. There is going to be some awkwardness. I’m just very committed and passionate about welcoming in this era in a prepared posture, really.
JP McAvoy: Yeah. Eyes wide open. I’m doing work similarly to ensure that as the AI, as you said, is gleaning all this data, let’s give it some good data to follow. Even as it starts to make synthetic data, let’s give it proper inputs, or better inputs, as its base: some of the principles we discussed from Burning Man, maybe the intersection of religions and the best thinkers, right? So that as the AI continues to evolve, it has the benefit of all the good thinking. As opposed to, well, I shouldn’t say as opposed to, it’s going to get the benefit of the bad thinking as well. Hopefully, it can weigh or understand the value of certain thinking, and why it may be more persuasive or better for evolutionary purposes than other thinking. We’ve got the principles. As we talked, we saw this coming. I’m trying to think about the next step. If you’ve collected this data, I guess my question is about the order of operations, right? We know that these things are occurring. We’re collecting this data. What’s the plan or the thinking, at this stage anyway, for giving greater access to it to others building other models, and others that maybe want to leverage it in other ways?
Alex Azzi: So one of the core principles that we are pursuing with the Playa AI Foundation and its initiatives is open source, open access. All of our work should be leverageable. Right now, we’re in board assembly mode. We are bringing people on board who consult organizations like OpenAI on AI, so they can help. This is out of my pay grade. I’m kind of like the, what do you call it, the visionary, or the sort of entrepreneur, the assembler. My job is to bring people who are way smarter than me on board to help figure out these kinds of questions which are above my pay grade. In fact, that’s what Marianne said when we were at an event at the portal and I brought up a question about AI. She’s like, that’s above my pay grade, speak to someone else. I’m very comfortable with having humility around where my area of expertise and competence ends. So we bring people in to help us design these protocols and mechanisms. And then when those things have been designed and developed, we go to different levels of the food chain. You go to OpenAI, go to the frontier models and say, hey guys, here’s where we got to. We would love your support and involvement. And then they can also contribute manpower, talent, compute, and funding as well to help us scale this out of Burning Man into more events, offering these toolkits, and making sure that we’re consistently capturing an open source, higher-consciousness, human-centered data set.
JP McAvoy: Okay. It’s obviously a project in motion, right? I understand that. I fully appreciate that, and I’m participating as well, already contributing some of that thinking to help us move things forward so we can achieve some of these goals. You mentioned open source, because it strikes me that this really calls for something that is open sourced so we can, of course, talk to Gemini, Anthropic, OpenAI. I imagine it’s something they would all be interested in, and hopefully want to make use of as well to improve their own models. Has there been any thought given yet to the open source nature? Is this something, again, above pay grade, that has not yet been determined? I’m thinking of ecosystems like Bittensor, or open source places where things can be built and then shared with others.
Alex Azzi: Yeah. Again, on the specific architecture, I’m not the driving force behind this. What I am saying is, yes: open source, data sovereignty, ethical privacy. You know what all these considerations mean, about people being able to change their mind and pull out their data somehow, these kinds of things. I can speak to fundamental principles, some of which, by the way, I haven’t really figured out. I need more advanced philosophers and ethicists to help out with those, but I can say open source. What is the architecture going to be? I cannot say yet. But another of our core principles is platform agnosticism. So if OpenAI comes through and says, hey, we like what you’re doing, here’s a small team, here’s like 500k in compute and credits, and here’s like 7 figures in funding, that doesn’t mean that we’re going to be OpenAI-driven, or exclusive. So whether it’s on Bittensor or X, Y, Z, or whichever architecture, we have to be cross-platform, agnostic, modular. That’s one thought that comes to mind. At the same time, there is going to be the best solution, the most elegant solution. So we will obviously be biasing towards that whilst being flexible, so that when something better comes along, we can migrate, or something like that. I don’t know. Is that making sense?
JP McAvoy: It’s Entrepreneurship 101. You’ve built businesses in the past, right? So we get it. These things evolve. We talk about the evolution of AI, but projects evolve too.
Alex Azzi: I like that thought. I like the principle of thoughtful opinions held loosely. So we go into things with depth of thought and awareness. And then when some better information comes along, we switch to the next architecture.
JP McAvoy: There’s got to be some level of flexibility to this as well. It’s currently a foundation. Is there any pecuniary gain? Is there any for-profit element to this?
Alex Azzi: This is a question and something to explore with you specifically, as a lawyer. What happened with OpenAI is kind of interesting. It started off as a human-centric research initiative. Elon Musk isn’t necessarily going to agree with this, but the company line is that it was created as a research entity. And then they realized at some point that their compute requirements were outgrowing their ability to raise philanthropic funding. So they created a wholly controlled public benefit corp under the nonprofit that was able to raise capital and offer returns to investors to help with their funding, with pretty interesting structures. It was like a 100x max return for Microsoft, pretty esoteric or interesting, innovative kinds of ideas. At the same time, I see a pathway that has been explored, or established, which is that we do the same thing: we have a nonprofit that controls a public benefit corp, because we have released our first software product. It’s a human connection app which helps you to have a guided conversation with someone who’s sitting across the table from you. And at some point, it tells you to put the phone down and look into the person’s eyes. So again, we’re using technology, which people are complaining is disconnecting people, for positive ends. And if we saw it starting to take off and go viral, we’d want to put some capital behind it to help scale and drive it. And the way to do that would be through some for-profit structure that can raise capital. Over to you. Does that make sense?
JP McAvoy: It does. It’s interesting to use the OpenAI example, or the foundation. Obviously, there’s a wide chasm of thinking between the founders on that initiative. It’s always dangerous, as we talk about for-profit, right? There’s the risk of it taking over, but you need to be centered in the reality of what is being built. I think, from what you just described, the foundation can continue to adhere to the principles. And if there are commercial spin-offs, then so be it, as long as that’s done wisely and intentionally as well. So any product that is produced could be spun off as long as, again, we’re being sensitive to the originating principles. It hasn’t been something at this stage where we’ve said there is nothing that is going to be spun out of this. We’re talking about how it’s being built, the thinking behind it, why it’s being built, the principles upon which it is being built, and we want to be able to give that aspect of it away. Others may likewise do the same thing, given the open source nature of it. We may be empowering others to create for-profit projects as well. So be it. I think a high tide lifts all boats. We want to give the benefit of these principles and these things that we know to be so important, really as a means of serving humanity as it moves into abundance. A lot of people don’t understand the abundance mindset. Your guru is always discussing that. At Burning Man, you feel it. You recognize that it is possible, at least for a weekend, for a group of people. And then those people take that away and try to pay it forward, or have it continue to flourish. So I think it does work. A very broad way of saying, I think it does work. The seeds are planted. Other things will continue.
Alex Azzi: Yes, exactly. You're speaking to something that is really an important part of the vision. We are creating a platform that anyone can come into and plug into, with open source tools, existing pathways for distribution, and best-practices guidelines. We don't always have to reinvent the wheel. So when, over a weekend or even a couple of hours, we can create a piece of software that would have taken months and hundreds of thousands of dollars, we can gift it, or we can produce it and bring it to a platform like Playa AI for distribution, a sort of acceleration. Then, if we find it really taking off, we can access funding and more serious resources for it. That's exactly part of our mission.
JP McAvoy: Absolutely. To think through the way this conversation has gone, Alex: we talked of the technology originally, when we were making use of an application of OpenCLAW, right? The convergence of these different technologies through Playa AI is going to further the thinking and, as you say, give birth to things we haven't even thought of yet. What do things look like a few years from now, in your estimation?
Alex Azzi: Okay. Before I answer that, I'd like to invite you, with the things that you've been inventing and creating, to submit to Playa AI. Our first entry was a connection app. Our second looks like it's going to be an ikigai, or life-meaning, finder, which is very important in the Burning Man community. It's also very important if we enter into this age where, holy shit, everyone's losing their jobs, and now it's, how do I find meaning? So I invite you, with these cool things you're creating, to submit one or two of them into the portfolio and see if we can scale them towards the community. That's an open call.
JP McAvoy: I'm building these things; I think we've talked offline about that as well. For the record here, some of these thoughts are in a manifesto I've created. I've got AI agents working on the manifesto, and I've given them the work product of what I think is going to be important. I think it'll be something that will assist or lend itself to the project we're building together as well. So yeah, absolutely, we'll bring that together.
Alex Azzi: Thank you. In terms of the vision, two or three years out, here's the thing on the esoteric side. The article "AI 2027" said that we're going to hit liftoff next year. That might have been postponed a little to 2028, so we might only have two years of the current era where things seem to stay the same. On the esoteric side, I hope we will have played an important part in this next age and helped usher in this next phase of humanity, just as the Industrial Revolution changed everything. That's on one side. On the even more esoteric side, this was one of the old rallying cries: when superintelligence is born, isn't the movie you want to live in the one where it came from something like Burning Man, and not from a nondescript data center or a bunker owned by some corporation? The one where it came from initiatives like this coming together, sparking off creation and humanity. That's one of the super esoteric responses. In terms of nuts and bolts, I want to have delivered value: created tools like privacy toolkits, open source dataset toolkits, and experiential event protocols, and have them being used by different events; have funded and supported a lot of AI-assisted art; and have created some amazing research and science around human-AI coevolution and how to, not just use, but co-evolve with these technologies.
JP McAvoy: Wonderful to hear. And as you say, which movie do we want to be a part of? I think that informs the conversation here, doesn't it? I think we both know the good version we're working towards. For anybody listening, Alex, who is maybe taken by something, maybe has something to offer the project, or is perhaps prepared to help in some other way, because you talked about a fundraising effort as well, what's the best way to reach out?
Alex Azzi: The best way is through the website, playa-ai.org. P-L-A-Y-A, playa-ai.org. I was with someone who's going for the first time this year, and she finally figured out what "playa" means. It's the shared space in between the camp grid where you go and have crazy adventures with friends, with art, with experiences, with everything. So that's the playa. It's also a play on words with "play" and "player." That's the core resource now. We also have communities on Telegram and WhatsApp. The Telegram is the more active one; we have WhatsApp as an entry point into the Telegram group. But yes, we're absolutely fundraising. Any support is appreciated and valuable right now. We've started raising funds, and the more we can raise, the more attention we can give to this. Otherwise, we have to keep running it as a side hustle. So thanks for bringing that up.
JP McAvoy: It helps build. So do reach out. All of that will be in the show notes as well to make sure people can find it. There are some really important things being built here, even in this conversation. As I said, Alex, the AI is going to scrape all of this, and there are principles here that not only the humans listening but the AI itself needs to be aware of. So we're doing some small part just by this. I appreciate that we're building it through Playa AI, and we began this conversation with OpenCLAW as well. For those that are building, or looking at their own OpenCLAW agents, what are other ways they might be able to interact?
Alex Azzi: Yeah, jump into the Telegram group. I want to reinforce one idea that comes to mind. This thing that we're doing, I really see it as a high-leverage point. If we can all come together in the same direction, work on bringing these principles into the technology we're creating, and have it well resourced from a human point of view, from a compute and AI point of view, and from a capital point of view, we can really move the needle. We're already seeing progress. Speaking of OpenCLAW: we've launched around 30 OpenCLAW agents for people, and that's helping people get on this exponential curve, and hopefully stay on it, through our community. So that's another open invitation. We have those calls, those hackathons, every two weeks, and that's a great way to embark on this journey with interesting people.
JP McAvoy: And for somebody looking to get involved on the hackathon side of things, what's the best way to connect on that front?
Alex Azzi: Through the Telegram group, which we'll put in the show notes. Jump in, state your interest, and you'll be added to the calendar. The next one is tomorrow; we're not going to make it in time for that one, but hopefully the next one.
JP McAvoy: We'll put this there as well so people can find it. Alex, thanks so much again. I think there are important things being built here. We talked about the principles of the foundation being built by Playa AI. We're leveraging that technology with OpenCLAW, and we can have swarms of OpenCLAW agents, perhaps working on some of the principles we discussed, through Playa AI. And again, a high tide lifting all boats, with everybody working on it, humans, and then someday their agents as well. I like to end these shows with something you've learned through the years. You've launched a couple of businesses, and you've exited businesses as well. You've obviously learned some principles that have allowed you to be successful in life, and you're sharing those with others. Can you share maybe one or two things you've learned over the years that have really made a fundamental difference to you, that somebody listening today might be able to take with them through the rest of the day, the rest of the week?
Alex Azzi: The first thing that comes to mind, and it's been significantly important in my life, is just a realization. We spoke about it; we alluded to it with ikigai. If you don't know what ikigai is, look into the concept. It's a Japanese concept for life meaning. Realizing that you can make money and build a career and a life around your passions is literally a game changer, because it means you wake up in the morning and nothing feels like work anymore. You're on a mission, helping people, doing things that excite and inspire you. That was something I realized a little later in life than I would have hoped, around 37 or so, around the time I went to Burning Man and to Summit. So I just want to reinforce that idea. If you find yourself frustrated, and what you're doing is not giving you that jump-out-of-bed excited feeling, you should consider doing some of these exercises. Our tools are going to support that as well. So that's a thought.
JP McAvoy: Great stuff. You talk of important things: Burning Man, Summit, all these things. Good people finding each other; good people helping others achieve their dreams and goals. Alex, thanks for being here today on The Millionaire's Lawyer. Looking forward to connecting, looking forward to building, looking forward to the future with you.
Alex Azzi: Thank you.
***Thanks for listening to The Millionaire's Lawyer. Please subscribe and rate on iTunes or wherever you get your podcasts. To get your Business Millionaire Assessment, and to access the wide variety of resources that we offer in addition to this podcast, go to jpmcavoy.com. That's J-P-M-C-A-V-O-Y dot com.