Post by account_disabled on Dec 19, 2023 23:17:36 GMT -8
You should not use hash links (#) in elements that load content dynamically. JS frameworks like Angular and Vue can generate URLs containing #, and Google ignores everything after the fragment, so these URLs are not indexed as separate pages. This is Google's documented position. How do you explain this problem to a developer? We want to make sure that each page has a unique, crawlable URL. Using JavaScript and hashed URLs for dynamically loaded content is bad practice and harms your website's visibility in the SERPs.
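A quick way to see why hash routing is a problem for crawlers is to split a URL into its parts: the fragment never reaches the server, so every hash "page" shares the same path. A minimal sketch, assuming a hypothetical example.com single-page app:

```python
from urllib.parse import urlsplit

# Two "pages" in a hash-routed SPA (example.com is a hypothetical domain)
home = urlsplit("https://example.com/#/home")
products = urlsplit("https://example.com/#/products")

# The fragment is client-side only; the server and Googlebot see one path
print(home.path, products.path)          # prints "/ /"
print(home.fragment, products.fragment)  # prints "/home /products"
```

With history-mode routing the paths would instead be /home and /products, giving each view a unique URL that Google can crawl and index.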
Robots.txt. Make sure your robots.txt file does not block JavaScript files. If Google can't access your JavaScript files, it won't be able to render your site's content correctly. Therefore, you should check that none of the folders marked as Disallow contain JS files. Example of a robots.txt file blocking a folder of JS files: [Image: robots.txt blocking the JS files folder] You can also use Screaming Frog to find blocked resources:
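For reference, this is what such a misconfiguration typically looks like, with a safer alternative below it. The /assets/js/ and /admin/ paths are hypothetical examples:

```
# Problematic: Googlebot cannot fetch the scripts needed to render pages
User-agent: *
Disallow: /assets/js/

# Safer: keep private areas disallowed, but explicitly allow JS and CSS
User-agent: *
Disallow: /admin/
Allow: /*.js$
Allow: /*.css$
```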
[Image: blocked resources in Screaming Frog] How do you explain this problem to a developer? We want to make sure Google can render the site correctly. To do this, we need to confirm that no important JS files are blocked from crawling in the robots.txt file. GSC Crawl Stats report. A good place to see how many JavaScript files Google fetches on your site is the Crawl Stats report in GSC. You can find it under Settings: [Image: GSC Crawl Stats report] When you open the report…
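Developers can also verify a robots.txt change locally with Python's standard-library parser before deploying it. A minimal sketch; the rules and URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt that (incorrectly) blocks a JS folder; paths are hypothetical
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /assets/js/",
])

# Googlebot would be unable to fetch the script needed to render the page
print(rp.can_fetch("Googlebot", "https://example.com/assets/js/app.js"))  # prints "False"
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))        # prints "True"
```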