There’s a term used in tech called “empire building”. It’s where managers and execs promote their own little slice of the company so that it survives and grows, advancing their careers along the way. At a certain level, it means someone who leads a division like AI has enough influence to say “let’s put AI into search”.
The sad thing about tech is that at a certain level, an executive rises above the customer in dictating what is best for a product. Data and stats can tell you whatever story you want to promote, so at Google HQ they’re probably worried about the negative press, but they’re looking at “successful” numbers of questions answered by AI and are patting themselves on the back. Both search and AI execs look good because they delivered something, and they’ll likely get a nice bump from their bosses in terms of rep.
The thing with empires is that they fall. Not overnight, and maybe not with the same emperor, but they do fall.
“Data and stats can tell you whatever story you want to promote”
Seen this so many times at my work. There’s some bone-headed decision and the people in charge are like “look guys, we ran the numbers”. But the methodology is flawed somehow, or they ignored or misinterpreted the numbers while pretending to follow the data, or it doesn’t bear out in the real world, etc.
When data and common sense disagree, you’d better be damn sure of the data.
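A classic way the methodology goes wrong is Simpson’s paradox: every subgroup tells one story, but the headline aggregate tells the opposite one. A minimal sketch with made-up numbers (the segments, counts, and “new vs old variant” framing are all hypothetical, just to show the arithmetic):

```python
def rate(successes, total):
    """Simple success rate for a (successes, total) pair."""
    return successes / total

# Hypothetical (successes, total) counts for a NEW and an OLD variant,
# split by how hard the segment is to serve.
new = {"easy": (80, 100), "hard": (2, 10)}
old = {"easy": (9, 10),   "hard": (30, 100)}

# The old variant wins inside EVERY segment...
for seg in new:
    assert rate(*old[seg]) > rate(*new[seg])

# ...yet the new variant "wins" the headline aggregate, simply because
# it was mostly rolled out to the easy segment.
agg_new = rate(sum(s for s, _ in new.values()), sum(t for _, t in new.values()))
agg_old = rate(sum(s for s, _ in old.values()), sum(t for _, t in old.values()))
print(agg_new > agg_old)  # True (≈0.75 vs ≈0.35)
```

Report only the aggregate and the data “proves” the new thing is better, even though it lost everywhere that matters.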
It’s called “data chauffeured”: instead of following the data, you tell it where it needs to take you.
driven data design