Dear valued users,
First and foremost, Boost would like to thank you for your patience and support over the last few days. We are pleased to announce that the recent issues affecting our system have been resolved. Our team has been working diligently to investigate and fix them, and we can confirm that the system is now functioning as intended.
Here is our detailed report on the incident.
On March 29, 2023, some of our users experienced indexing issues reported by Google Search Console as follows:
- “No Index” and “No Follow” Tags Prevented Indexing for Collection Pages in Some Stores
- Collection Pages Indexed but Blocked by Robots.txt
What we did to investigate the issues
After carefully investigating and fixing the incident, our team would like to share how we found the root cause of the issues above:
- When checking the page URLs provided by customers, we found that Shopify served the pages correctly in normal web browsers. However, when testing with Googlebot, the pages always loaded with the “?view=boost-pfs-original” parameter. We contacted Shopify for further investigation, as we suspected Shopify might be serving different page content to Googlebot.
- While waiting for Shopify's response, we continued investigating and concluded that Googlebot was being redirected: it could not fetch our API, even though the API ran normally when we tested it through several different methods.
- We discovered that Googlebot reported errors when making API requests to our servers, yet those requests never appeared in our server logs. We worked with the AWS team to inspect the infrastructure and search through historical logs for the failing requests, but could not find them.
- We then found a similar case reported for Googlebot three days earlier: the server responded normally to regular requests, but Googlebot could not reach it. The suggested solution in that case was to review the robots.txt file, as the issue appears to have been caused by a Google update rolled out on March 15, 2023. Since that update, Googlebot requests the robots.txt file of a cross-origin domain before calling an API hosted on it. Even if the API URL itself responds successfully, Googlebot will not call it if the API domain does not serve a robots.txt file.
- We reviewed our own services and found that we had recently deployed several updates to improve app performance, which route part of the traffic to new services alongside the old ones. During that update, the robots.txt file was missing from the new services. This explains why Googlebot stopped making requests: it could not find robots.txt, even though the new service's API was running fine. As a result, the collection pages were indexed but Googlebot could not crawl their data.
- In addition, our app includes a fallback mechanism that keeps users' stores working in case of API errors. When an API call still fails after two retries, the app redirects the collection page URL to a URL that includes “boost-pfs-original”. To avoid duplicate content, which would hurt the Google ranking of customers' stores, we apply “no index” and “no follow” tags to URLs containing “boost-pfs-original”. When the fallback was triggered, these tags instructed Google not to index the pages, causing the issue some users reported on March 29, 2023.
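To illustrate the robots.txt behavior described above, the same pre-flight check Googlebot performs can be approximated with Python's standard urllib.robotparser. The domain and path below are placeholders for illustration, not our actual API host:

```python
from urllib import robotparser

def googlebot_can_fetch(robots_txt: str, url: str) -> bool:
    """Return True if the given robots.txt content allows Googlebot to fetch url."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())  # parse the file content directly, no network call
    return rp.can_fetch("Googlebot", url)

# A permissive robots.txt on the API domain lets the crawler proceed;
# "api.example.com" is a stand-in for the real API host.
print(googlebot_can_fetch("User-agent: *\nAllow: /",
                          "https://api.example.com/filter"))
```

The key point is that this check runs against the API's own domain: a working API endpoint is not enough if the domain hosting it serves no robots.txt at all.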
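The retry-then-redirect fallback described in the last bullet can be sketched roughly as follows. The function and constant names here are hypothetical, chosen for this write-up rather than taken from our production code:

```python
MAX_RETRIES = 2  # the app retries the API twice before falling back
FALLBACK_PARAM = "view=boost-pfs-original"
NOINDEX_TAG = '<meta name="robots" content="noindex, nofollow">'  # avoids duplicate content

def resolve_collection_page(fetch_api, url: str) -> dict:
    """Try the filter API up to MAX_RETRIES times; on repeated failure,
    fall back to the original Shopify page, marked noindex/nofollow."""
    for _ in range(MAX_RETRIES):
        try:
            return {"url": url, "body": fetch_api(url), "meta": ""}
        except Exception:
            continue  # transient API error: retry, then fall back
    separator = "&" if "?" in url else "?"
    return {"url": url + separator + FALLBACK_PARAM, "body": None, "meta": NOINDEX_TAG}
```

When the API is healthy, the page resolves normally with no robots meta tag; only the fallback URL carries the noindex/nofollow directive, which is why the tags appeared on affected stores once the fallback started firing.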
The issues have been fixed
We have fixed the issues with the new server services and implemented a dashboard to monitor metrics, stats, and alerts related to Googlebot. We are also reviewing our API to prevent incidents like this in the future.
If your store has URLs affected by the issues, we suggest using Google Search Console to request a recrawl of those pages. If the problem persists, please contact us.
The recent Googlebot crawl issues, though resolved quickly, show that there is still room for us to grow and improve:
- We will increase system monitoring frequency. Our team is implementing more frequent system checks and developing advanced tools to ensure that potential issues are identified and addressed as quickly as possible.
- We will also improve workflows for dealing with unexpected issues. This involves developing more streamlined processes for identifying and reporting issues.
- We will review our infrastructure and API to detect and eliminate potential threats.
We take the reliability and performance of our system seriously, and we are committed to ensuring that our users have a seamless experience when using our service.
If you experience any issues or have any questions, please do not hesitate to reach out to our support team. We are always here to help and are committed to providing the best possible experience to our users.
Thank you again for your understanding and support as we worked to resolve this issue.