The Flywheel of Trust: Personalization + Privacy
Finding a balance between meeting consumer demands for personalized experiences and respecting their data privacy expectations is tough, but it’s possible. Join e-commerce and publishing industry leaders as they explore how businesses can build a flywheel of customer trust that respects data privacy while enabling personalization.
Trishla Ostwal (00:11):
Hi everyone. I am Trishla. I am the AI and tech privacy reporter at Adweek. And before we get started, I would love for our speakers to introduce themselves. Why don't we start from the far end? Yeah.
Cathy Polinsky (00:25):
Great. Hi, I am Cathy Polinsky. I'm the CTO of DataGrail. I've had over 20 years of tech experience working for many large companies like Amazon, Yahoo, Salesforce, and Shopify. I worked at Stitch Fix with Julie, and I worked for a lot of companies that really cared about data privacy. But even at really mature, amazing tech companies, it was a hard problem. And that's what really drove me to DataGrail.
Jess Hertz (00:58):
I'm Jess Hertz, I'm the GC of Shopify. I feel confident that no one at this conference is going to ask me about Spotify. I can't even tell you how many times people come up to me and say, "Hey, what are you listening to these days?" And I have to explain Shopify is not the same as Spotify. So I work at Shopify. I've had a ton of different jobs over the last 10 years. I've been in and out of government for most of my career. I started out as a regulatory lawyer. Okay, show of hands. Does anyone know what OIRA is?
Oh my gosh, I got one. Okay, so it's the Office of Information and Regulatory Affairs, the regulatory office within the Office of Management and Budget. So I started my career in government, in and out. I've been on the private sector side. I was at Meta, where I ran their North American regulatory team, which we can talk about. And I was a partner at a law firm, and you'll see a picture of mine from that time in which I was eight months pregnant. So there you go. But I'm super excited to be here with you. I've been at Shopify for almost two years, where I met Cathy, and I'm delighted to talk to you all today.
Julie Bornstein (02:03):
I'm Julie Bornstein. I started my career in sort of traditional e-commerce with bigger retailers, Nordstrom and Sephora, and then I joined the board and later went full-time at Stitch Fix, where the way we used data was totally different from how retailers had traditionally used it. Cathy and I overlapped; she was the CTO there when I was there. Then I started my own company called THE YES, we were acquired by Pinterest last year, and now I'm working on my next thing.
Cathy Polinsky (02:38):
All about personalization though.
Julie Bornstein (02:40):
Yeah.
Trishla Ostwal (02:42):
Okay, amazing. All right. So I've spent some time reporting on the moral reckoning of the ad industry and the marketing world, especially after the Cambridge Analytica data scandal. Ever since, we've seen a push for a federal privacy bill, but nothing happened. What took shape instead was a complex patchwork of statewide privacy laws. Navigating that and keeping up with compliance, from my understanding, is quite a challenge. So why don't we set the context: how have you addressed privacy over the last eight to 12 months at your company? Jess, we can start with you.
Jess Hertz (03:31):
Sure. Well, as I mentioned, I'm a regulatory lawyer by trade, so privacy is something near and dear to my heart. We can talk a little bit about my time at Meta, where I worked on some of the FTC settlements that happened. Regulatory and privacy issues have been core to my experience, and I have certainly brought that into Shopify. So we have a pretty robust privacy and compliance program. We work a ton with our privacy engineers, and we've been talking a lot about a blend of legal and technical approaches.
We're in a little bit of a different situation, which we can talk about on the panel, in terms of our positioning. As a platform, what we enable for our merchants and what our North Stars are is a little bit different, I think, than Julie and Cathy's experience. So we can talk about that. But, just to say it, it's hard. It's a challenge. We heard in the last panel about this ever-evolving shape of privacy, and I think from my perspective, that's an opportunity, right? There's a kind of newness around privacy, and that is a challenging thing. But in the spirit of good actors and good intentions and trying to solve hard problems, we come at it pretty creatively: what we need to do globally, what we need to do locally, how those things interact. Putting together a comprehensive strategy is something we spend a ton of time on. I'm excited to talk more about that.
Trishla Ostwal (05:05):
Got it. Julie, same question.
Julie Bornstein (05:10):
I come at it from a non-legal standpoint. I come at it from: what's the most data we can humanly possibly collect, in the spirit of creating a good consumer experience? The businesses I've been in have been about collecting data in order to give you good recommendations. And my experience has been that obviously you want to assure the user that the data you're collecting is specifically for use in their recommendations and experience and won't be used for other things. So that's just a baseline.
But what's interesting is that collecting that amount of data requires trust, and it's an ongoing process: if you ask a question and you're saying we're asking so that we can create a better experience for you, you had better create a better experience. You stop being able to collect that data if you're not delivering on the experience. So I would just say, as a baseline, and I think we'll get more into it, the purpose of the data collection, the communication around what you're doing with it, and then the ability to show that you're using it are critical if you want to be able to continue to collect that data.
Trishla Ostwal (06:42):
Got it. Cathy, for you, what are some of the things or challenges that clients talk about, especially when it comes to compliance?
Cathy Polinsky (06:51):
I think that, like I mentioned earlier, companies really want to do the right thing, but it is really hard to navigate the changing landscape of laws. They want to make sure that they're staying in compliance, but also doing what they can to build really amazing personalized experiences. And I think that the complexity of SaaS adds a lot of challenge for a lot of customers, because we want to use the best tools, the best software out there, the best AI, and yet it means that for every single application or technology or platform you're using, you have to really look and see how they're managing the data that you're sharing with them.
(07:40):
And so we see a lot of folks who, first off, are making sure they have a good view of all of the software and the data mapping needed to understand what they're using today, then putting in some lightweight processes to evaluate new software, and then managing all of the complexity of the sprawl of data.
Trishla Ostwal (08:03):
So I guess there's been this ongoing dichotomy for a couple of years now: how do you maintain campaign performance and audience growth at the same time while addressing privacy? Jess, I would like to hear from you on how you balance both of them.
Jess Hertz (08:22):
So as you ask that question, this Walter Isaacson quote pops into my head. Tell me if anyone's heard this, right? He's been talking about Elon Musk and this question of whether we're becoming a nation of referees versus risk-takers. Has anyone heard this yet? No. Okay. Well, I think it's a great quote. I'm coming at it from the legal perspective, but I also run the comms and government affairs teams, so I have this sort of interesting portfolio at Shopify. And as we think about balance, to me it really is this balance of wanting to empower innovation and empower risk-taking. We're all at companies where that's rewarded and that's necessary. And so it is a little bit of an ethos too; especially coming from the legal perspective, how we communicate with our partners and how we communicate internally really matters.
(09:15):
And so I think a lot about how we do our crafts really differently at Shopify. It's something Cathy will remember; we talked about it a lot. What are we doing to be unorthodox, and how do we come at this practice in a different way? And so when you talk about this idea of growth, particularly in the sphere of something like privacy, it's really trying to think about the fact that we don't want to be a nation of referees here. You can get stuck in many, many different patchworks of privacy laws. If you're trying to map every specific thing out, you're going to spend your time drowning in compliance requests.
(09:56):
But if you lift up and think about, "Okay, what are we actually trying to enable here?", we want to be empowering for risk-takers. We don't want to be those referees on the field, while understanding, of course, that we want to come at it in a really respectful, thoughtful, and compliant way. It's as much an approach, I think, for us at Shopify, or for me personally, as it is anything else in how we lead our teams and how we think about managing those issues.
Cathy Polinsky (10:25):
It's a company of entrepreneurs focused on serving entrepreneurs too. So that makes a lot of sense.
Jess Hertz (10:30):
For sure, for sure.
Trishla Ostwal (10:32):
Yeah. Julie, from a brand perspective, how do you balance both of these ever-evolving wheels?
Julie Bornstein (10:42):
I think first you have to start from a place where you trust your colleagues and you trust the approach of the company. If you're clear that everyone's in it for the right reason and doing the right thing, then it gives the team the freedom to experiment and explore. And one of the interesting things is that there's sensitive data people will share if you ask them for that information; the level of trust can get very high. One of the things that Cathy and I both saw at Stitch Fix is that as we would ask questions, and there was always a form people could fill out to tell us what was going on with them as they were getting their next fix, which is a box of five items selected for them, they might tell us they're pregnant before they've told their families, hopefully not before their husbands.
(11:49):
And the reason is that the data is not going to be shared. It's not going anywhere, and it's needed for us to be able to get you the right clothes. So we did maternity, and we could move you along the cycle from early to late in your pregnancy. And I think there was a very strong code in the company around the value of that information. People took the information that was shared very seriously, and not that there was anything natural to do with it outside of the product we were creating, but there was a culture of trust and a culture of confidence. And so what we saw was people would share information, and then sometimes we would do things like send them flowers when they had their baby, those kinds of things. It actually is an opportunity to create a really strong relationship with your consumer. I've always thought of gathering people's data this way: they're entrusting you with information that is yours to treat very carefully, but it is also a great opportunity to really connect with your consumer, in a world where that's pretty rare these days.
Trishla Ostwal (13:17):
A follow-up to that. Sensitive data is such a hot topic right now. So when things changed on the legal and regulatory front, especially with Roe v. Wade, did you feel that within the company? Were your customers hesitant to share that information?
Julie Bornstein (13:37):
I don't think the average customer is so focused on the legal landscape. I think most consumers are pretty smart; they understand what they're facing and that different services and businesses use data differently. So I think there's a general understanding of that. On the legal landscape, mostly what the company did, and what the companies I've been in since have done, is just make sure that they really understood it, and that the way the data was set up, stored, and protected met all of the criteria.
(14:15):
But obviously there's been a little bit of a shift in consumers' mindsets, in part, I think, because of Facebook and some of the other things that have happened. And I think we should ask about that, because what happened at Facebook during all the changes is super interesting. But it doesn't affect every business in the same way. It depends on what business you're in and the way you communicate to the consumer about what data you're collecting.
Trishla Ostwal (14:45):
Perfect segue into my next question. Jess, do you want to paint a picture of what it was like working at Meta at that point, when things broke the internet, and of course touch on how the media shaped the narrative as well?
Jess Hertz (15:02):
Yeah, I mean, I think it's a pretty interesting topic to think about. We've talked a lot about it, and I'm happy to do that. I was at then-Facebook, now Meta, at the time that Cambridge Analytica happened, and through some of the aftermath of that. We don't talk a lot about it for some reason, but Cambridge Analytica really did shape, in a lot of ways, the course of privacy today. We talk about regulators, we talk about legislators; I do think companies play a big role in that. Places like Facebook, Apple, AT&T are changing the course of industry and people's actions and behaviors. I think the media has a huge role in that too. So understanding how the media covers something, how the media is pushing a narrative, does that change behavior? Right. That's a big question.
(15:51):
And certainly around Cambridge Analytica, the sort of media onslaught of it, "breaking the internet" makes me think of the Kardashians, but Meta too, I think it really did change the course of what we have today in terms of privacy. It created a zeitgeist around privacy, in a way that we're seeing today with AI, in different ways, shapes, and forms. It did, without legislation, really change the course of an industry. And I think that was as much to do with the FTC and their response to Cambridge Analytica as with the media and how they covered Facebook at the time, and what that relationship was and remains.
(16:40):
So the media shouldn't be underestimated as a player in the toolbox, or in its ability to shape the course of what's happening on the ground. You've got some pretty interesting constituencies that overlap in a lot of ways, in a Venn diagram, and I think Cambridge Analytica is a perfect example of how all of those came into play at the same time and really did change the course of what happened next.
Trishla Ostwal (17:06):
Yeah, kind of carrying that conversation forward, can you tell us some of the interesting challenges or even unusual challenges that you've recently faced at your company in terms of privacy?
Jess Hertz (17:21):
In terms of privacy? I mean, I'll take that in a slightly different direction, just carrying forward some of my last comments. I think as I've been listening to some of the conversations today, what I've been also thinking about is, what are the different incentives that players have in terms of what they bring to the table? And let me unpack that a little bit. I do think from a government perspective or from a legislative perspective or even from a regulatory perspective, you're essentially optimizing around consensus.
(17:50):
You have to do something. We were talking about the CCPA and CPRA, and you're essentially required to build consensus to pass something. In the private sector, in companies, and I'd love both of your takes, it's different; that's sort of what I was getting at with the Isaacson quote. You're able to take risk in a very different way, and you're able to optimize for consumer behavior, or make choices around what you want to optimize for, in a very different way than if you're trying to pass legislation, for example.
(18:23):
So as I think about responses to different changing regulatory landscapes, yes, it's how do we comply with a new set of laws, but it's also: what are we actually optimizing for, for our merchants and for their consumers? How do you take into account a different set of behaviors and incentives than you do when you're sitting in a government seat, for example, which I've sat in for quite some time as well? It's just a different set of considerations. So I don't know, Cathy or Julie, if you have similar thoughts or different thoughts on that.
Cathy Polinsky (18:59):
Yeah, there's so much to go through on that question, but piggybacking on what you were saying earlier, one of the biggest challenges is balancing innovation with risk, and really thinking about that in the context of DataGrail, where being the steward of people's data is important. That's something that has carried through all of the companies I've been in. I was at Yahoo at a really interesting time, when we were using the first machine learning algorithms in search and thinking about how we could use data to help advertisers understand their audience. And Yahoo really cared a lot about privacy at the time. They had a group called the Paranoids who were responsible for understanding how the company could be a good steward of data.
(19:47):
And so it really took champions within the company to figure out how we could still be innovative, how we could use technology responsibly, and still build amazing experiences for our customers. That's a landscape that keeps changing with generative AI and a lot of the new technologies, but it's something we always have to be vigilant about.
Trishla Ostwal (20:13):
Can we dwell on that a little bit more? From a publisher perspective, what are the challenges, and how do publishers ensure audience growth while having all this privacy stuff to deal with?
Cathy Polinsky (20:30):
Yeah, I mean, at the time we wanted to let publishers know about share of voice and whether they got more traffic from the East Coast or the West Coast, across different geos and different genders. But small publishers or websites might not have enough traffic, so you could identify an individual based on geo and demographic information. It's about making sure that you have a big enough audience size and are not exposing more information than you want to when you're providing these types of tools and support. And I think that's also important when you're using information from an AI perspective: making sure the data that's coming in and out draws from a big enough pool, if you're trying to de-identify folks, so that you don't carry enough metadata with it that individuals could be exposed.
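To make the audience-size point concrete: the safeguard Cathy describes is essentially a k-anonymity-style threshold, where any audience segment smaller than some minimum is suppressed rather than reported. Here is a minimal sketch, assuming a hypothetical (geo, gender) segmentation and an arbitrary cutoff of 50; the names and data are illustrative, not anything Yahoo or DataGrail actually ships:

from collections import Counter

# Minimum segment size before a breakdown can be reported.
# Any (geo, gender) bucket with fewer than K visitors is
# suppressed rather than exposed, in the spirit of k-anonymity.
K = 50

def report_segments(visits, k=K):
    """visits: iterable of (geo, gender) tuples, one per visitor.
    Returns (published, suppressed) segment counts."""
    counts = Counter(visits)
    published = {seg: n for seg, n in counts.items() if n >= k}
    suppressed = {seg: n for seg, n in counts.items() if n < k}
    return published, suppressed

# Hypothetical usage with a small publisher's traffic log:
visits = [("US-West", "F")] * 120 + [("US-East", "M")] * 80 + [("EU", "F")] * 3
published, suppressed = report_segments(visits)
print(published)   # {('US-West', 'F'): 120, ('US-East', 'M'): 80}
print(suppressed)  # {('EU', 'F'): 3} -- too small to report safely

Real reporting pipelines are more involved, since overlapping queries and attached metadata can still re-identify people, but the suppression rule above is the core idea.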
Trishla Ostwal (21:26):
So Julie, you spoke about how audiences are sometimes willing to share really sensitive information, which in hindsight is a good thing for the company, because it builds good, solid first-party data. But aside from when you see those things happen, I'm sure there are challenges that you as a brand must be facing too, especially now that we have cookie deprecation and everything else that's happening within the realm of tech. So what is something unusual that you've seen?
Julie Bornstein (22:01):
I think there are a couple of different things I've seen. One is that I had a really interesting moment earlier this week. My son, who's 17, and I were talking about a song that he loves, and then he literally opened Instagram and there was a video with the song playing. And I said to him, you can turn off listening; there's a feature in the privacy settings on your phone. And he was like, "No, I like seeing that." So that was a moment where, and I always think of things through the consumer lens, I realized the mindset of people varies very greatly. As you're collecting data, it's about understanding there's going to be a big range, being able to address that range, and giving people options.
(22:53):
One of the things we're thinking about right now in my new company, and there are a couple of different fun topics here. One is when someone gives you information about their body, so they say, I have thick calves, or I have ... The concept is a conversational stylist who will then make recommendations for what to wear, for men and women. And so there's the question of how to echo back that data, "because you have thick calves" or "because you ..." There are probably even more sensitive topics, like I have flabby arms or whatever, and so you're saying, "because you have flabby arms." So there's a whole thought process around how you take the data and then share it back, trying to be very transparent but also not offensive. So that's one thing we've been thinking about. And then the other thing-
Jess Hertz (23:51):
I'm willing to be part of your sample size. Just putting that out there.
Julie Bornstein (23:56):
Awesome. We'll add you to our list. But the other one is that part of the platform we're building is social, so you can get suggestions from your friends or ask for people's opinions. And the other conversation we've been having is: if someone's saving a bunch of bras, or maybe they're saving presents they're thinking about buying for someone, do you make things private by default and then let people share them, or do you make things public and then let people make them private? So I think there's a lot to think about, even just from a pure consumer-controls perspective, in how you want to create your product.
Jess Hertz (24:47):
And the one thing I would add on top of that is that we think a ton about UX layers. It's similar to what you're saying, but not only the controls; how you communicate and what that UX layer actually looks like ends up driving a huge amount of what happens. Part of my learning at Shopify has been how deeply we care about that UX layer: exactly what's happening, exactly what people are seeing, and exactly how we facilitate those choices ends up being incredibly important.
Trishla Ostwal (25:22):
And from a consumer perspective, how do you convey such a complex message in a seamless, understandable manner? Especially given that California residents have different privacy rights, so you talk to them in a different way than someone sitting in New York. How do you address that? How do you build trust with consumers and make sure they understand that what you're doing is actually for their own benefit?
Julie Bornstein (25:55):
Yeah, I mean, it's back to Jess's point. It's all about the visual and text experience that you're sharing. Every site has its privacy policy, and what a lot of companies are doing is just taking the most strict version and applying it to everyone if they can. There's California, then there's Europe, and the standards are different. I think there's real value in making the privacy policy readable, not pure legalese, so that if people actually want to check it out, they can take away good information. And there is so much crap out there. Crap is not the word I intended to use, but I couldn't think of a better word: questionably trustworthy sites and experiences.
(26:52):
I'm a very un-paranoid person, although I did turn off listening for Instagram, and I rarely go and read a site's privacy policy. But if it's a new service or experience or website, I will try to figure out if it's legitimate. We used to have these very long privacy policies that were very hard to read and understand. One of the things we did at THE YES, my prior company, was rewrite ours in a way that felt really understandable, and there's a lot of value in that. So I think it shouldn't be underestimated. And then the information you give as you ask a consumer a question, telling them what you're doing with it and explaining what's actually happening, is also very valuable. So a lot of it does come down to the layer of what the consumer is actually seeing and how you've designed the flow for sharing information.
Cathy Polinsky (27:52):
So I'm not a very trusting person with technology, and it's really interesting, because I knew a lot of friends who used Stitch Fix before I started working at the company. And I went to the site probably half a dozen times and started looking at the ... How many questions were there?
Julie Bornstein (28:07):
43.
Cathy Polinsky (28:07):
43 questions about what parts of your body you want to keep covered or not, how much you weigh, and things that I wouldn't share with some of my best friends were going into this form. So I think it goes beyond the privacy policy that many people don't read; it's about how you're building trust with your consumers and making sure they can understand, in plain text, how that information is being used. It also comes down to the reputation of the company. I think Shopify has done really well there with payments for a wide number of very small businesses, because they've built trust that the Shop Pay brand is going to use the credit card responsibly and that you won't have data leakage. For smaller companies out there that don't have a long reputation or a long history, it's harder to start building that audience and trust and to start getting people to share with them.
Trishla Ostwal (29:08):
Sometimes you also see this in the media. When the whole Sephora thing happened, people were shocked. I mean, I personally shop at Sephora, and I was like, "Oh, I'm going to write about this, and it's not good." So how does a brand, not just Sephora, but how do other brands address that concern that's out there? Okay, they're collecting my data, and it's good because I'm going to get products that I like, but there's always this: what if things go wrong? Data leaks are happening every other day. So how do you address that?
Julie Bornstein (29:44):
I'm guessing most people don't know the Sephora issue, so maybe you can share it.
Trishla Ostwal (29:47):
Well, it's been a minute, but hopefully I get this right. Sephora, I think, sort of bungled the CCPA rules. They collected people's data without their knowledge, and it led to a hefty fine. It's probably one of the big cases that's come out of the CCPA, but not the only case. I feel like there's more that's been happening in the background. So yeah.
Jess Hertz (30:13):
I mean, I think this also goes back to the way you build products. Every company's going to have its own way and own culture in terms of how it builds products. And some of this is about infusing privacy as a product principle, to some extent. That's going to vary depending on the type of company you're at, the type of data needs you have, whether you're a processor or a controller, and just what you do with data and what privacy really means to your company.
(30:45):
But I do think the point of all of these changing laws and landscapes is to infuse privacy as you're building product. So in terms of overall takeaways, and maybe Cathy, you have a different view on this, it has become something where we have privacy engineers; there's infrastructure that gets built in and discussions that happen. We were talking about the Meta FTC obligations; those exist in order to infuse privacy at the product level. And that is really the change it's inspiring.
Cathy Polinsky (31:24):
And one of the ideas behind this panel is that you'd think of privacy as being in direct conflict with personalization and innovation. What I've found in my experience working for a lot of consumer companies is that it's the opposite: in order to build personalized experiences, you need to build trust with your audience. You need to build trust in your brand. Every time you listen to what people are saying, honor their interests, and give them a better experience because of the information you're collecting, you create that flywheel. They'll want to share more information because they know you're going to use it well, be a good steward of the data, and handle it responsibly. The more they share and the more you use it well, the more they want to share, and that builds really good trust. But you can break it at a moment's notice, and once you break that trust, it is very hard to earn it back.
Trishla Ostwal (32:32):
This is probably the longest I've gone talking about privacy without anyone throwing the word personalization into the mix. 30 minutes, that's what it's taken us. Okay. Personally, I don't like the word, because what the hell is personalization? But I'll leave it to you experts to break it down. What does personalization mean from a brand perspective, and from a DataGrail perspective as well? Yeah.
Julie Bornstein (32:59):
Yeah. The idea of it is the promise of the web: that each user can have a different experience based on their interests and needs. It's taken a very long time to materialize, and still the majority of things we see in certain sectors are one size fits all. I think media has really been the biggest leader in personalization. And I will say it in a different context: I think Spotify, not Shopify, was one of the earlier examples of personalization done well. Let me tell you about my music preferences, and then you're going to fill my feed with all sorts of new things that I would ... And so I still believe in the concept of personalization.
(34:00):
I still think it's very valid. I think it plays out in scary ways too, like in news feeds that feed people more of what they want to see or react to. So we've seen lots of different examples of it, but it really is what the promise of the internet was: data and technology can serve each person something that is different and relevant to them.
Trishla Ostwal (34:32):
Jess, would you like to add to that?
Jess Hertz (34:34):
I mean, I think you're spot on, Julie, in terms of what that means. I'll go back to one of the original points I made, which is that personalization has gotten a bad reputation to some extent in the industry. There's gravity that comes with some of these concepts, particularly in the privacy, data, and security space, and I think personalization has gotten tagged with some of that. But I totally agree with Julie in terms of what the concept behind it means. And as we think about personalization, there are all kinds of other questions there about where a company's ethos or a personal philosophy lands on choice, what that means, and how that intersects with something like personalization and people being able to customize and choose.
(35:28):
And how that gets infused into a company or a product really is something of a philosophical question. But I do think it has gotten tainted with a lot of negative connotations as a result of either circumstances in the industry or other kinds of potential legislation. There are all sorts of competing forces here, but that's just a little bit of commentary.
Trishla Ostwal (35:56):
Cathy.
Cathy Polinsky (35:58):
No, I think they said it well. A lot of it is Julie's world: Nordstrom was known for personalized experiences in the physical world even before the rise of the internet. It's really about how you recreate that one-to-one interaction you have at face-to-face events like this in a digital world, and I think that's still evolving.
Trishla Ostwal (36:32):
We're reaching the tail end of this conversation, but I think a good thing to leave the audience with is: how has measuring success changed for you? And if you could share any examples or data, I love data, anything that helps us understand how things have evolved, that would be great. We can start with you, Julie.
Julie Bornstein (36:53):
I mean, I do think there was a sea change around the time of the Cambridge Analytica issues with Meta, when people started to think about their data differently. And I actually think so much of how we think about our data is tied to what the company's product is. For Facebook and other social media businesses, our data is their product. That is what the company was built around, that is how it operates, and doing things with that data is what their business is. So I think it depends on what business you're in as to how to think about what you're doing and what to share with a customer.
(37:52):
Whereas from a customer perspective, one thing on the personalization piece is that when it's done poorly, it's extremely disappointing and sometimes angering. And so I do feel like part of the reason maybe you have a negative feeling about it is that it's-
Trishla Ostwal (38:11):
No, no, no, I just don't like the word. I have no feelings about it.
Julie Bornstein (38:16):
But maybe you don't like the word because it doesn't always deliver, and it's a high promise and a...
Trishla Ostwal (38:24):
Perhaps, but also, when talking to marketers, and this is just based on my reporting, one marketer means one thing by it, and for someone else it's a completely different thing. So I'm like, it's such a buzzword in the marketing world.
Julie Bornstein (38:36):
Overused.
Trishla Ostwal (38:36):
Yeah.
Julie Bornstein (38:37):
Yeah. Yeah, I think that's fair. I guess what I would say is, ultimately, how you think about the data you're collecting from your consumer is very correlated with what business you're in. Collecting data at Stitch Fix is very, very different from collecting data at Facebook. And that distinction then leads to different things, both from a consumer perspective and from the company's perspective.
Trishla Ostwal (39:07):
So at Stitch Fix, having the sort of questionnaire format for your consumers, how has that materialized into success for the company?
Julie Bornstein (39:20):
Well, the company uses that data to make recommendations. And it does two things: it helps improve the quality of the fix, the items that are selected, and it also creates a forum for a conversation with the customer, giving the customer a chance to feel heard. When we're not paranoid about what's happening with the data, we all just want to be understood. That's a very human need. So if you can build a relationship where you make a customer feel heard and then deliver on that experience, that's a really positive one.
(40:07):
One of the things we found was that you would think, intuitively, that the fewer questions you ask, the more likely you are to convert someone at the end of the process, because they can get through it faster. But we actually found that when we added questions, if they were questions that people felt we should be asking in order to give them a good fix, it actually increased the conversion rate at the end of the series of questions. I think that speaks to the human instinct of wanting to be understood and wanting to share all the information that's relevant for someone to do a good job for you. And it takes the right UI, the right wording, the right reputation, and then the right delivery of the service to build that trust over time.
Cathy Polinsky (40:53):
And for those of you who haven't used it, Stitch Fix asks you profile questions and matches you to a stylist, who curates a box and sends it to you sight unseen. They trust the recommendations and personalization so much that they pay for shipping of these boxes without the customer seeing the items until they open the box.
Julie Bornstein (41:15):
And now that Cathy and I aren't there, they need to improve their business. So give it a try.
Cathy Polinsky (41:23):
Exactly. But the other thing, to your point about complexity, is that I think there's a difference between personalization for consumer companies versus enterprise. That's also hard. At companies like Salesforce and Shopify, we looked at ourselves as stewards of data, building platforms for other companies, but we had to really think about how you build platforms that can be used on different types of data sets and trained on different data sets, without owning the data or intertwining models with each other. And it's a harder problem.
Jess Hertz (42:06):
Yeah, it's a really good point. And for Shopify, certainly, success is actually facilitating our merchants' success. We're in a really great position where our incentives are a hundred percent aligned: Shopify doesn't succeed unless its merchants succeed. And from a philosophical standpoint, understanding how you reduce complexity in this space is incredibly important, because you want to set your merchants up extremely well to be able to comply with the vast number of privacy laws and other types of laws in their space. So trying to create a platform that not only does what we need to do as a company but also facilitates the success and compliance of our merchants is, I think, for a company like Shopify, really the gold standard.
Trishla Ostwal (42:57):
Well, I learned a lot and thank you so much for joining us here today. I hope everyone enjoyed.
Cathy Polinsky (43:04):
Thank you.
Jess Hertz (43:04):
Thank you.