Reviewing the webpage metadata

You want to review a webpage's metadata for disabled links/scripts in an application under test (AUT).

Which of the following choices can be used for this purpose?

Options
  1. Robots
  2. Crawlers
  3. Spiders
  4. All of the given options
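
All three terms refer to the same kind of automated agent that fetches pages and honors (or at least reads) a site's metadata, most notably robots.txt and `<meta name="robots">` directives, which is where disabled/disallowed links and scripts are declared. As a rough illustration of what such a review involves, the sketch below uses only the Python standard library; the base URL is a placeholder and not part of the original question.

```python
# Minimal sketch: inspect an AUT's robots metadata.
# AUT_BASE is a hypothetical placeholder URL.
from urllib import robotparser, request
from html.parser import HTMLParser

AUT_BASE = "https://example.com"

# 1. robots.txt lists paths the site asks robots/crawlers/spiders not to fetch.
rp = robotparser.RobotFileParser()
rp.set_url(AUT_BASE + "/robots.txt")
rp.read()
print("Is /admin/ crawlable?", rp.can_fetch("*", AUT_BASE + "/admin/"))

# 2. <meta name="robots"> tags carry per-page directives such as noindex/nofollow.
class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

html = request.urlopen(AUT_BASE).read().decode("utf-8", errors="replace")
parser = RobotsMetaParser()
parser.feed(html)
print("robots meta directives:", parser.directives)
```

A real crawler would follow links recursively and respect the directives it finds; this sketch only shows where the metadata being reviewed actually lives.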
