Google destroys their own search engine by encouraging terrible SEO nonsense and then offers the solution in the form of these AI overviews, cutting results out of the picture entirely.
You search something on the web nowadays and half the results are written by AI anyway.
I don’t really care about the “human element” or whatever, but AI is such a hype train right now. It’s still early days for the tech, it still hallucinates a lot, and I fundamentally can’t trust it—even if I trusted the people making it, which I don’t.
Google was already going downhill, but the point where it was obvious they weren’t interested in search anymore was when they fired Matt Cutts and replaced him with an advertising person.
You will never be able to consistently find the truth when the AI is optimized to overfit toward any result at all, so long as something is shown with ads next to it.
AI has no way of understanding truth. It’s autocomplete trained on anything it can find, truth or not.
AI can also be trained with extra weight on hand-picked sources of truth. Whether you then agree with the choices of whoever is doing the hand-picking is a separate matter.
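To make that concrete, here is a rough sketch of one way "extra weight for hand-picked sources" can work in practice: a retrieval step that boosts or demotes results by source before the model ever summarizes them. Every domain and weight below is invented for illustration, not anyone's actual list.

    # Toy re-ranker: boost hand-picked "trusted" domains, demote known junk.
    # All domains and weights are made-up examples.
    TRUST_WEIGHTS = {
        "who.int": 2.0,               # hypothetical hand-picked source
        "nature.com": 1.8,            # hypothetical hand-picked source
        "content-farm.example": 0.2,  # hypothetical demoted source
    }

    def rerank(results, default_weight=1.0):
        # results: list of (domain, relevance_score) pairs
        boosted = [(domain, score * TRUST_WEIGHTS.get(domain, default_weight))
                   for domain, score in results]
        return sorted(boosted, key=lambda pair: pair[1], reverse=True)

    print(rerank([("content-farm.example", 0.9), ("who.int", 0.6)]))
    # who.int now outranks the content farm despite a lower raw score

Who maintains that list, and what ends up boosted, is exactly the hand-picking problem.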
Would be a shame if different “truths” made more or less money and someone optimized for one of those.
Yeah, they call it A-Z testing, because A/B wasn’t enough and AIs can fill in cases A through Z with ease.