Page experience concerns because of crawler detection #889
ChrisRadny started this conversation in General
Hi.
I am not using lazysizes; I am using b-lazy code with my own IntersectionObserver integration. My code is from 2018, and I found some issues when Google renders a page: Google renders only about 500 px of height, while I had thought the initial viewport was 9000 px tall. It seems Google changed the initial viewport size between 2018 and 2020. The issue is that, with my code, Googlebot won't see the images below the initial viewport.
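
For context, here is a minimal sketch of the pattern I mean (the data-src attribute name and the selector are just my convention, not b-lazy's exact API). Because the real src is only assigned once an image intersects the viewport, and Googlebot doesn't scroll, images below its initial viewport never load:

```ts
// Minimal IntersectionObserver lazy loader, roughly what my 2018 code does.
// Images start with data-src and only receive a real src once they intersect
// the viewport; a crawler that never scrolls never triggers the swap for
// below-the-fold images.
const io = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src ?? ""; // swap in the real image URL
    observer.unobserve(img);         // each image only needs one swap
  }
});

document
  .querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach((img) => io.observe(img));
```
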
I searched for modern lazy-loading JS scripts. Every script solves my problem with bot detection: when a crawler is detected, it changes data-src to src so that Googlebot can pick up all the images.
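
Roughly, the detection those scripts do looks like this (the user-agent regex is only illustrative, not an exhaustive bot list, and the real libraries use more robust checks):

```ts
// If the UA looks like a crawler, skip lazy loading entirely and promote
// every data-src to src up front, so the crawler sees all images at once.
const isBot = /googlebot|bingbot|yandex|baiduspider/i.test(navigator.userAgent);

if (isBot) {
  document.querySelectorAll<HTMLImageElement>("img[data-src]").forEach((img) => {
    img.src = img.dataset.src ?? ""; // crawler gets every image immediately
    img.removeAttribute("data-src");
  });
}
```
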
But I really have concerns about page experience. With bot detection, Googlebot has to load all the images on page load; that is not the right behavior, and it will break page experience.
At the moment I am experimenting with loading="lazy" attributes on the img tags: bot detection still changes all data-src attributes to src, but loading="lazy" defers the actual loading of the images.
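
As a sketch of that hybrid (again assuming my data-src convention), the bot path could promote every image while leaving the deferral to the browser:

```ts
// Promote every data-src to src so Googlebot can discover all images, but
// set loading="lazy" first so the browser itself defers fetching offscreen
// ones. Assigning loading before src matters in at least some browsers,
// which only honor it if it is already in place when src is set.
document.querySelectorAll<HTMLImageElement>("img[data-src]").forEach((img) => {
  img.loading = "lazy";
  img.src = img.dataset.src ?? "";
  img.removeAttribute("data-src");
});
```
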
What do you think?
Thanks, Chris
Replies: 1 comment

As of my knowledge …