I noticed Google started showing AI Overviews on my searches last week. These answer boxes now appear for questions on both phones and computers. They made me laugh before, back when they pulled weird claims from Reddit jokers. But have these AI summaries improved since rolling out to UK users? And what might happen to websites because of them?
The current design shows a short answer that you can expand with a click. When you do, links appear on the right side showing where Google found each piece of information, and you can click the citation icons next to individual claims to see exactly which website supplied that specific detail.
Curious, I set out to check which searches make these AI boxes appear and which websites Google trusts enough to cite. Without any fancy tools, I manually tested 50 different search queries, plus hundreds of more casual checks, and interesting patterns emerged across different types of searches.
Food recipes rarely show AI answers. Neither do travel questions about places to visit or attractions. Questions about specific famous people usually skip them too - asking for Taylor Swift's height or the number of Elon Musk's children won't trigger one. But home improvement, gardening, and crafts almost always display AI summaries.
Health topics frequently show AI boxes despite Google adding a small warning that says this information shouldn't replace medical advice. The system avoids harmful questions like extreme weight loss methods. Finance questions sometimes display AI answers for general topics but skip specific product recommendations like bank accounts or mortgage rates.
Car-related searches split down the middle - fixing scratched wheels, yes; changing tyres, no. Shopping searches almost never show AI summaries, especially when looking for the "best" products. The pattern suggests Google deliberately avoids AI answers for anything phrased as a "top" or "best" query.
For the searches that did trigger AI answers, I checked three things. Were the sources relevant? Was the information correct? Did Google favour already popular websites? Most sources fit the topic, but occasional strange choices appeared, like linking product pages instead of informational articles.
Some of the stranger examples: health answers that ignored trusted medical sites in favour of fitness clothing retailers, and one that cited YouTube comments suggesting cleaning vinyl records with sandpaper. Another named the wrong artist as having the most Spotify listeners. One answer about wasps made killing them with gasoline sound reasonable, until you clicked through and found that the cited website strongly warned against this dangerous method.
High-ranking websites dominated the sources Google used. In my counts, the page in organic position one appeared in AI answers 31 times, position two appeared 29 times, and the numbers fell away for lower positions. This matches what larger studies found: pages already ranking on page one supply 99.5% of AI answer content.
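A tally like this needs nothing fancier than a notepad, but for anyone wanting to repeat the exercise, here is a minimal sketch of the counting step. The sample list below is placeholder data for illustration, not my actual logs:

```python
from collections import Counter

# One entry per citation observed: the organic ranking position
# of the page that an AI answer cited. Placeholder values only;
# a real run would append one entry each time you check a query.
cited_positions = [1, 1, 2, 3, 1, 2, 5, 1, 2, 4]

tally = Counter(cited_positions)
for position, count in sorted(tally.items()):
    print(f"Position {position}: cited {count} times")
```

Sorting the tally by position makes the drop-off for lower rankings easy to see at a glance.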
Overall accuracy has improved greatly compared with earlier versions that suggested putting glue on pizza cheese. Smaller websites may occasionally be cited for unusual questions, but they probably won't receive many clicks, because people read the answer without visiting the source. That dynamic may reduce traffic even for highly ranked pages.
The research suggests website owners should continue focusing on standard search techniques rather than trying special "AI optimization" tricks. The sites Google already trusts for regular search results become the same ones it uses for AI summaries. Your best strategy remains creating excellent content that ranks normally since those top positions supply almost all AI answer material.