Designing for Our Relationship with AI
by StratMinds
Full Transcript: Sara Vienna
I'm Sara Vienna, VP of Design at MetaLab. Not Meta, not MetaLabs, not MetaMabs: MetaLab. I'll tell you a little bit more about us. There are so many jokes we have about being stuck with that name. It's ridiculous.
So I'm going to get personal real quick: talking about relationships, talking about feelings, talking about user experience. I'm so happy to be here! This is like coming home for me. I don't know if you can hear that. Right now I live in San Diego, but I am moving back to the island of Oahu, which was my home starting in 2019. That happens in August, so I'm very excited about it. It's even cool to see some friendly faces from the burgeoning tech community here today. So super stoked for all of it.
I also just want to tell you a little bit about me, because I think it's important to know why you should listen to me. You might be thinking: she seems kind of like a hermit, I don't see her on social media that much, what's going on? I've been a designer my whole career; I taught myself Photoshop in high school. I've been doing it for so long. Every iteration of what I've done in design has been in service of making beautifully aesthetic things, things that drive value for people's lives and make people happy. I don't really care about much beyond that, really. So as soon as that stops happening for me in my career, I'm done with this career. I really believe in that as the driving force for why I wake up in the morning.
I started my career in graphic design. If any of you know the San Francisco maker-art culture, I came up with O'Reilly Media. I have a background in art direction and graphic design; I'm a typography nerd. I've also designed physical products, so I bring that lens to my design process. I've been designing products, honestly, since the App Store opened: names and brands that you know and love. I've worked with founders. I've worked with celebrities, OMG, to design apps and experiences for them. That's the lens I come to you with today. I think that's why they keep me employed at MetaLab. Very happy to be here. Thank you.
Okay, so I'm going to tell you what I'm going to tell you and then I'm going to tell you. So we're going to talk a little bit more about MetaLab. We're going to define relationships as they're happening in technology today. We're also going to talk about responsibility. I am very much a person who is about the responsibility, the choices we make as makers. I'm going to give you a few predictions, but I do not fancy myself a futurist. So I'm going to tell you a little bit about where design is heading. But I want this to be a conversation too. I love the engagement here. It's been so cool to talk and hear the speakers engage you too. You're all so quick and smart. So when I ask you a question, please answer. Cool.
Okay, so now you know why, hopefully, you should listen to me, if only so you don't make Richard, Summer, and Anton mad by walking out. But MetaLab: who are we? We're a Canadian-based company, very proudly Canadian, but now we are worldwide, everywhere from Australia and New Zealand through Eastern Europe, though we do span further sometimes. That tends to be our time-zone range right now. A group of 150-ish incredibly talented people: product, product, product only. Yes, our business model is agency, but we kind of eschew that name. We really think of ourselves as your partner, and we think of ourselves as your partner in such an incredible way because we have proven it.
We have a VC arm; Dave from our team is here today and is going to tell you about it later in his panel talk. We invest in early stage. That was the founding of our company: we designed the first version of Slack. We wish we had invested in it. And so that is our core, product and everything in service of it. So like I said, all we do is build products. You've probably seen these names and logos. I'm not going to give you the full spiel, because you don't want to be sold to, but I do just want you to know where I come from. Change. Right. Okay. Couple more. Let's see. More. I think it's on the video. I might have to just drive it. Okay. Cool. Oh yeah, look at that latency. Well, I didn't design it that way. This is supposed to be a bunch of logos.
So maybe we're missing some processing power here. But let me tell you that overall, we are very much in service of product only, and my focus is user experience, product design, and brand. That's the lens that I come at this with. And we have reached so many humans across this globe. We say 2.2 billion users reached, but actually, if you count our work for Nubank, it's significantly more; we built them a design system. We have shipped 385 products and counting and built 12 unicorns. We're really proud of these numbers, and we continue to push forward in service of making products that people truly value and that change lives for the better.
So this is our experience in AI and ML. I'm allowed to talk to you about only a very small amount of this right now, but I am very proud of some of these logos that you know and love, and that you're going to get to know and love. Shout-outs to a couple of people here. I think we might have some Shortwave folks in the audience, which is really cool. Modular, Suno, Bardeen, Otter. I won't go down the line, but we're also very proud of the work that we've done with Val to see them to success in the journey. So I'm going to show you some work today, but it's all under the auspices of work that's been launched, or that we at least have permission to show. If you ever have any questions about any of the work that we do, I'm happy to talk about it one-on-one.
So let's talk about relationships. You're in a relationship with AI right now. And I know that sometimes the word relationship can feel a little like: how far are you going to go with this relationship concept? But every one of us, because we're in this room, is in a relationship with AI. It deeply informs our lives, not just because you're makers, but because you're users. And so when... oh god, this is not going to work. I think we've got to swap laptops because I have so many videos. Sorry. No, that's OK. All right, well, let's see how we go, because I don't want to swap out.
But I'm not the first person to tell you you're in a relationship with AI. This is a TED talk by Liesl Yearsley. She's incredible; if you haven't read or watched it, I highly recommend it. Without giving away every detail, what she says in this talk is that they had to shut down multiple models because of the engagement they saw from their own internal team, people creating relationships that got too deep, too emotional, too fast. It scared them. So highly recommend a watch.
Now, I bring this to you. This is going to take a minute, but I'll talk anyway. I do know that it's a little silly when we think about relationships through the lens of movies like Her or Westworld. That's what this video is supposed to be showing. Let's see if I can hit play. There we go. So this is some CNN coverage, and what you're going to see is them talking to Michael. Hello, Michael. He's telling you about this basic idea that we have artificial love and artificial relationships with AI, and that it's potentially ruining a generation of humans. And while this is almost inflammatory, I would say, and some people might look at it and scoff, we are seeing that people are more and more open to the idea of engaging with AI in a relationship and in that back and forth. Thirty percent of people said that they would be open to it if they were lonely. That's pretty significant in this day and age already.
And so even though this feels like the Hollywood version of what the potential could be, I do think it's important for us to look at artists, creators, directors, and writers, who try to create stories around relationships and bring a little bit of love to it. I don't know if you all saw this launch; this one was really funny. So this is Angry GF, and I swear I didn't even write the responses. Basically this is a call and response of you trying to placate your girlfriend. And you know what I thought? That's misogynistic. I hate this. But then I unpacked it and I was like, oh, this is built by a woman. Okay, cool, cool, cool, cool. Shame on me. What she's doing is building this to train you toward confident responses, being able to communicate in a way that smooths over the relationship, with AI. There's no winning, by the way. Go try it. But I do think it's very interesting, because it's a gamified experience of trying to communicate with something that ultimately is taking a very stubborn stand. And so I really love that, even though the execution is quite wild and almost silly.
And so thinking about something that's a little more serious: I want to shout out some friends we were unable to bring here today, who are behind this incredible product, Hume. I think that when we think about relationships and creating relationships with people, we need to think about how far we are going to anthropomorphize the experience, meaning: how much of that human characteristic do we add to the experience? Do you want to talk to a human?
Do you want to talk to a robot? Do you want to talk to something in between? I think that as we create these experiences, we should think of that nuance just like Sujin just talked about in her last talk of really understanding the choices that we're making when we're adding that personality layer to the experience because experiences in design are very much about personality.
This product is incredible, by the way. I've watched all of them and I'm very excited about it. I think the use cases are endless, too. In those moments when we really need to connect in an emotional way, of course we want to leverage these tools, because we want to drive value for the person and build that trust. But the question is: what does consent look like, and where's that line, when we're in that trust-building exercise? We're dealing with humans, and emotions are always going to come into play. You just have to recognize that when you design these types of experiences.
I want to point you to the Vice Guide to Culture. If you haven't read it already, please do. I'm a Vice fan from the days of yore, during the Dos and Don'ts era. Highly recommend looking those up too, even though they have nothing to do with AI. But Vice is still producing very incredible thought pieces about what's happening to our culture today.
One of the things they talk about in this report is the idea that AI is already becoming ubiquitous among youth. Eighty percent of their audience is already using generative AI. Eighty percent. And they expect it to just get higher and higher; the numbers are going to climb. When you think about that, what's happening is the ceiling is being lowered. Access is being created for anyone with enough of an internet connection to run something like this. We need to do more across the world to democratize that access.
Think about that ceiling being lowered: honestly, what's that going to do? It's going to make it even harder to design these experiences. You have to introduce more differentiation, more delight. Every day we have to push toward delighting, challenging, and then driving value for that user, because if we don't, people are going to forget about your product very quickly. They absolutely will. That's the risk you run. If you don't invest in design, you run the risk of really losing the human behind that experience. That's what this is saying in so many words: you really, really need to double down on creating that human connection.
Right now, the future of relationships is being shaped by the people in this room. When I was writing this talk, which is new, by the way, I was like, I really want to call out to you all and tell you how important that is. This is an incredibly important moment. We're at an inflection point in culture and inflection point in tech. I really want to ask you all right now, just a couple of people, what do you think our responsibility is?
My name is Arfi Gullian. I run a nonprofit research lab focusing on AI and social impact. I've been following this whole AI girlfriend thing for a while, all the way back to what Replika was first creating, being part of internal Facebook groups and monitoring how people get addicted to those types of relationships. I think there's a reason why this whole trend is resonating: because we're so disconnected, we're so lonely, and we're looking for those types of relationships.
It's also fascinating that as human beings, we're resolving the need for sexual satisfaction through a kind of junk food of, let's say, sexual replacement. The thing that has really been missing in our lives is that actual convenience. The reason why AI girlfriends work so well is because they personalize to you. You live your life with them. That's the type of companionship that we're actually looking for in a permanent partner.
But, and here's where the social impact piece comes in, they currently lack an objective function. Obviously, in the real world, it's procreation. But what is the objective function in a digital relationship? And who will own that objective function? That's a really scary question. So my question to you, to counter-argue this: who should own these relationships?
I think you're asking all the right questions. I think the makers own the relationships. I think the users own the relationships. And it's a symbiosis that we're experiencing right now between that conversation, regardless of what the task is or the job to be done.
Anybody else have a take on responsibility?
Hi, Sara. Good to see you. Leah, representing the local Hawaii AI community. I actually know the founder of Replika, Eugenia, who started her journey in AI by replicating a late acquaintance of mine from Moscow based on his text messages. He was a very well-known, very loved figure in the Moscow innovation scene, so a lot of people missed his very unique takes on things in life. When she built this chatbot, a lot of his former friends were turning to it to kind of get his take on life once he was gone.
To answer your question, I'm thinking a lot about what the future will look like. One of the best examples we can look at from the past, I think, is social media. Social media added a lot of delight to our lives and expanded our social communities; it allowed us to reach people we weren't able to reach before. It also created loneliness, because we're no longer in touch with the people who are around us. We're now talking to somebody on the other side of the earth instead of talking to our neighbors.
My take on this question would be we shouldn't solve every single problem that we see because sometimes the solution will create new problems. I think our responsibility is to evaluate those potential problems and think about the consequences of new products that we're building.
Thank you, Leah. Did you get a peek at my talk beforehand? Sorry. Yeah, that's lovely. Thank you. In asking the question, you're already on the right path. That's what this is about, because you hold power that can change society.
Like I was saying, AI is eventually going to be ubiquitous. It's persuasive, it's pervasive, and it's getting smarter every day. You all know that; I don't need to tell you. I'm the designer in the room.
I do want to take this back from futurist forecasting to the tangible. This tweet is very funny. Dave, who I work with, sent it to me; he was laughing. He's like, look at how ridiculous this is. It's a silly conversation on Twitter about whether saying something is "powered by AI" is like saying something was "coded" or is "on the Internet."
I think this is just a really good example of the moment we're in, where it's still such a novel idea for the majority of people. When we think about how we position it and how we brand it, ask yourself: how are we talking about AI in the way that we present it to the world? I think eventually it's just going to be ubiquitous in every product. I know there are some people out in the world who don't like that idea, but ultimately it will be a driving force for so much of our culture.
We're using it every day already. You all know that. We've got some folks here from Meta and Exitst, I think. We're trying this out right now; we're launching these sandbox-like experiences to the masses. What we need to recognize, by the way, is that this train is not going to stop. It's not stopping. This is from Sora, by the way.
If the train is not going to stop, you hold the power in this room. What does that mean for us? Because I don't want the train to stop either. We're building companies and making money and building futures. I ask you what kind of future do you want? I really think that it's important that we all think about that when we're designing these experiences.
I think we can build a future where technology and humans are aligned, where there is a symbiotic relationship that drives value to both. When we think about that, it's about asking ourselves a hard question: are we rewarding companies that are profiting off short-termism, or are we rewarding companies that are actually looking toward the long term, and rewarding the right incentive models? And then the whole question of what's right comes into play. But these are the conversations that we need to have as people with this immense power.
I know that we can. I believe in humanity, because I'm forever an optimist; I cannot help myself. I think we need to change the model, and I think we need to challenge ourselves when we see that model being flipped. I also think that when you have that power in your hands, you're the people to do it, because who else will? Regulation will not catch up.
This is from our brand book for Next Data. I just love it; it's so beautiful. But this is about empowering humans, right? We want to empower people; we don't want to oppress. I do think the decisions we make can impact that. And so when we take a look... this is where I'm like, oh boy, here's where it fails.
This is a slide where my videos are supposed to play, but I'm just going to go on. It looks hot on the screen, I swear. This is about connectors and divisors. The things that connect us in this world are positive things, things that ultimately play out in a positive way for humanity. But there are also divisors. Some of the products that we've launched, built, and use every day can be both at the same time. When you're designing these experiences, really leaning into this question of what will be a connector in the experience versus a divisor is an incredibly important one.
I'm sorry, those don't play. So Liesl is basically saying that we can't sleepwalk into the future: we don't need another master, we need a mother. I really do believe that, if we think of this in the context of our power, and the caring and the wraparound that we want to add to our society and our culture, this is an incredible quote. So we need to design businesses with incentives that actually drive good, right? Make money, make good.
All right, Sara, cool. So we know now you're an optimist; like, whatever, got it. But what does that really look like? I think right now we're in such a nascent state of user experience and design, and I'm going to show you a couple of examples of what that looks like. You all know this formula: user needs inform our experiences and feature sets, right? Jobs to be done. Drive business results. Pretty darn simple. That formula does not change, and it doesn't change with AI, either. When we think about rooting ourselves in the needs of people, that's how we design. That's the orientation we have to start from, because if we don't, people won't adopt your product, period. Even if you're making mistakes along the way, that's okay; those mistakes are learnings. But ultimately you have to be driving these experiences from the orientation of caring about these people and caring about the outcome.
I've got just three simple things for you to keep in mind when you're designing these experiences. First, take a field that is opaque and technical to most humans, and bring humanity to it. Be helpful. That's so simple, but it's actually really complex when you think about all of the tensions between doing right by the user and right by the business. Those things can have inherent tension. It's okay to explore those tensions; it's okay to recognize them. Don't just sweep it to the side. Have the hard conversation.
Oh, God, this is the videos. Okay, this might be where we have to swap. But basically, let's see. Okay. Yeah, okay, so the video's playing. This is Augment, and Augment is something that StratMinds and MetaLab have actually invested in, an incredible client of ours. Augment is a tool that is basically your personal assistant. It seamlessly integrates into your operating system, and it gets you ready for anything you need to do to tackle your day. Imagine being able to walk into a meeting and know exactly what you're talking about: every single file that you need, every single person you're talking to, the background and the context, all at your fingertips. When you have that, you can be so much more present, so much more in tune with the human on the other side of that important conversation, rather than sifting through all of your browser tabs or trying to find that email with all the things you needed to do.
We're really excited about Augment. This is only the beginning, but incredible founders and an incredible product so far. There was a line we really had to walk here: okay, if we believe we're in a relationship with AI, if we're designing for this relationship, how much information do we need from the user in order to bring real value? We basically need access to your entire OS, almost every login that you have. Think about the integration level here: if we're really going to drive an experience like a personal assistant, like your texts, we have to have access to that information. When the team was designing for this, we really had to walk the line between privacy, consent, a little bit of that dissonance of how much am I going to give away, a little bit of those roadblocks, so that we could build trust, so that we could drive an experience that ultimately is going to be something that people love and will pay for all the time. So that's Augment.
The next one: a value prop, old marketing 101, has to resonate with people. But it also has to be paid off in the product, and those two cannot be disconnected; if they are, people simply won't adopt the experience. As we layer in AI, I just want to show you a little thing from this company we work with called Tally. Dr. David Sinclair is one of the leading minds in anti-aging and longevity. He came to MetaLab and said, hey, I have these models; I need to integrate them into an easy-to-use user experience, so that people can get this physical product, test themselves, and know: how well am I actually doing? I haven't done this, by the way, because I don't want to. I don't want to know that I'm actually 52, even though I'm not. But Tally is... oh yeah, is that product live? Yes, it is. Yes.
We're really proud of Tally, of course. We've had to hold back some of the numbers, but the engagement across Tally has been huge. The response has been huge. And it's only the beginning. What we ended up doing was really thinking about: what is the value proposition that we're designing for? What does the experience need to do in order to enable that value proposition? And then, how do we pay it off in the product and also get out of the way? People aren't here to spend their whole day looking at their biometrics. They're not. They're here to make sure they're doing the right thing so they can go on with real life. And that's the difference: this is not just about eyeballs on the product. So we're excited about this one.
The third one is really putting yourself in their shoes. This is just a simple human golden rule, right? Do unto others. You have the power. You're the makers. Let's put people forward. Let's put ourselves in their shoes and let's design an experience that we can be proud of to know that we did right by the user.
This is a really cool experiment. The Atlantic is one of our clients, and we built them this sandbox that allows them to pull from their incredible archive, over 150 years of content, and bring that wealth of knowledge forward with an experience that feels really fresh, new, and conversational. I'm such a nerd for this type of visual design. If anybody came up in that era, you know: yeah, just go for it, get a little brutalist with it. But this is a very safe place for people to kick around in the product, understand the wealth of knowledge that is The Atlantic, and also be able to create experiments.
Here's where I go down the designer rabbit hole. I don't know if you know, but Ralph Waldo Emerson was a founder of The Atlantic. So the team ended up creating these really beautiful, almost ASCII-like patterns from him and his profile, just to really add that level of care. This is where aesthetics, graphic design, visual design, and artistry can really come into play.
I have three minutes; someone's telling me I'm in trouble. Okay. So when you think of designing for this relationship, it's really three simple things: bring humanity, pay it off in the product, golden rule. Very simple.
So I went through and asked the MetaLab team and some ex-MetaLab people, who are now at basically every company making something interesting. I don't have time to read these all to you, but we will be able to share this talk later and give you a link to it. Basically, they're giving you advice, advice directed at you. This is a design lead at MetaLab; she's talking about co-creation of the data set. This is our CTO, talking about everything looking the same, talking the same, acting the same, because we're building off the old models: old equals old. This is Jackie, our director of talent; she's talking about the access we actually have right now. Anybody can get into this field, because it's so nascent. It's really incredible. This is Gunnar; he used to be on my team and now he's at Perplexity. He's talking about how you introduce simplicity, automation, and transparency into the process. And this is Cam, one of our design directors, and he's talking about starting with an honest human problem. It's the human problem that comes first, then the solution.
Quickly, I'm going to give you a little bit of where I think design for AI is heading. Honestly, if you ask AI where it's heading, you're going to be like: whatever. I tried, because I really had to test it out, and it was all things I'd heard before. So, simply, this is where I think AI is going. But I am not a futurist, like I said; I am a present-tense, very tangible person. I am a person who gets to work with the most talented designers in the world, and this is where things are coalescing for us right now. Right now there's so much content, and AI is only going to create more; the overwhelm is just going to be exacerbated by AI.
So what's going to happen is that we're going to have to curate. We're going to have to curate down to hyper-, hyper-personalization amid this content overwhelm, because there's no more scarcity when those engines just keep churning out content.
I think we're going to move from the attention economy to meaning. I think we already are going there, which is really inspiring. I think we are in a scale period and we're going to move to depth. The thing about depth is that designers are empaths. The people who are attracted to design are empathetic and they want to go deep on emotions and the human experience. So rather than building for scale, we will build for deep meaningful experiences.
We're going to go from fatigue - everything's AI powered, AI this, AI that - right to ubiquity. It's just going to become natural. It's going to become every single conversation that we have. It just exists. It is. We're going to go from a lack of access to actual education, which I'm very excited about, and then opening up and democratization of this technology because it's happening. And we're going to go from distrust to transparency. And I'm an optimist. I told you that. But this is really exciting if we think about the opportunities that we hold in our hands right now.
So here's a video that's not going to play. Wow. Here are the three; you get the idea. And I really want to tell you: just imagine a better future. I know you can. You have so much power in your hands. This video was going to be a cool story, and I know that you can imagine that future faster. That "future faster" is really cool, though, because, and I'll leave you with this even though I don't want to quote Steve Jobs all the time, he did say that the computer was like a bicycle, right? A bicycle for the mind. And if you point that motorcycle toward something positive, you're going to make something incredible and make life better, faster.
This is where I'd take questions, but we're not going to do questions now, because we need to get to Chris, and then you all have lunch outdoors, and then you can ask questions. Okay. Let's give it up. It was so good. All right.
At StratMinds, we stand by the conviction that the winners of the AI race will be determined by great UX.
As we push the boundaries of what's possible with AI, we're laser-focused on thoughtfully designing solutions that blend right into the real world and people's daily lives - solutions that genuinely benefit humans in meaningful ways.
Builders
Builders, founders, and product leaders actively creating new AI products and solutions, with a deep focus on user empathy.
Leaders
UX leaders and experts - designers, researchers, engineers - working on AI projects and shaping exceptional AI experiences.
Investors
Investors and VC firms at the forefront of AI.
AI × UX
Summit by:
StratMinds
Who is Speaking?
We've brought together a unique group of speakers, including AI builders, UX and product leaders, and forward-thinking investors.
Portal AI
Ride Home AI fund
Google Gemini
Metalab
Slang AI
Tripp
& Redcoat AI
Stanford University
Google DeepMind
Grammy Award winner
Google Empathy Lab Founder
Blossom
Lazarev.
Chroma
Resilient Moment
Metalab Ventures
of STRATMINDS