In the video, he reminded viewers that Google updated the crawler it uses to find public, crawlable pages. Googlebot had long run on an older, special-purpose version of Chrome, but it now uses the newest version of Chrome so it can render pages the way users actually see them in their browsers. Googlebot now runs on what Google calls an “evergreen” version of Chromium, meaning it always uses the latest release, much as PC users keep their web browsers constantly updated.
He also explained that it makes little sense to submit a redirecting URL for indexing, because the redirect tells Googlebot that the website administrator would rather have the destination URL indexed. He therefore recommends that webmasters simply submit the URL they want indexed or, better yet, make sure Googlebot can discover it automatically. If the URL is properly linked within the site, Google will find it through regular crawling.
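To make the redirect behavior concrete, here is a minimal sketch (not from the video, and using hypothetical paths `/old-page` and `/new-page`) of why submitting a redirecting URL is redundant: a crawler that follows a 301 redirect ends up at the destination URL anyway, and that destination is the page worth indexing.

```python
import http.server
import threading
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Toy server: /old-page permanently redirects to /new-page."""

    def do_GET(self):
        if self.path == "/old-page":
            # A 301 tells crawlers the content has permanently moved,
            # so the destination URL is the one to index.
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"canonical content")

    def log_message(self, *args):
        pass  # keep output quiet

# Start the toy server on an ephemeral port.
server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Like Googlebot, urllib follows the redirect automatically and
# lands on the destination URL.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page")
final_url = resp.geturl()
body = resp.read().decode()
server.shutdown()

print(final_url)   # ends with /new-page, not /old-page
print(body)        # the content served at the destination
```

In this sketch, fetching `/old-page` resolves to `/new-page`, which is why the advice is to submit the destination URL directly rather than the redirecting one.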
We have embedded the YouTube video here in its entirety: