Blocking /css and /js used to be common on the theory that they add no crawl value, but Google now recommends against it: Googlebot renders pages, and blocking CSS or JS can hurt how your pages are indexed. If you do block them, it won't break anything for other well-behaved crawlers, but it's no longer considered a best practice.
Best practice: Always have a robots.txt file, even if it's empty.
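A minimal, permissive robots.txt consistent with the advice above might look like this (the sitemap URL is a placeholder for illustration):

```
# Allow all crawlers; a permissive robots.txt is a safe default.
User-agent: *
Disallow:

# Optional: point crawlers at your sitemap (example.com is a placeholder).
Sitemap: https://example.com/sitemap.xml
```

An empty `Disallow:` directive means nothing is blocked; serving this file (or even an empty one) avoids the 404s that crawlers would otherwise log when requesting /robots.txt.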