You could construct a robots.txt file to allow the robots you want and ban the others; the trouble is that many bad robots and scrapers simply ignore the directives.
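For reference, a minimal robots.txt along those lines might look like this (the crawler names are just examples; an empty Disallow means "allow everything" for that agent):

```text
# robots.txt - allow the crawlers you want, disallow everyone else
User-agent: Googlebot
Disallow:

User-agent: Bingbot
Disallow:

User-agent: *
Disallow: /
```

Well-behaved crawlers honour this, but as noted it is purely advisory and the bad ones will walk straight past it.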
It's worth going right through your server logs first to make sure robots really are the issue, and not, say, someone hotlinking your images. If you then want to try to establish the identity of a caller, run a reverse DNS lookup on its IP address.
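A quick sketch of how a reverse lookup works, in Python (the IP used is just an illustrative Googlebot-range address; `reverse_dns` needs network access, so treat it as a sketch):

```python
import socket

def ptr_name(ip):
    """Build the in-addr.arpa name that a reverse (PTR) lookup queries."""
    return ".".join(reversed(ip.split("."))) + ".in-addr.arpa"

def reverse_dns(ip):
    """Ask the resolver for the PTR record of an IP; None if unregistered."""
    try:
        host, _aliases, _addrs = socket.gethostbyaddr(ip)
        return host
    except socket.herror:
        return None

# Example: reverse_dns("66.249.66.1") on a genuine Googlebot address
# returns a googlebot.com hostname; forward-resolve that hostname and
# check it maps back to the same IP (spoofed user agents fail this).
print(ptr_name("66.249.66.1"))  # -> 1.66.249.66.in-addr.arpa
```

The forward-confirmation step matters: anyone can set a PTR record or fake a user-agent string, but only the real owner controls both directions of the lookup.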
Depending on your server and the level of access you have, you may be able to block by IP; on an Apache server this is quite straightforward using an .htaccess file. Blocking by domain is equally possible.
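As an illustration, an .htaccess fragment for Apache 2.4 might look like this (the addresses are from the documentation range 203.0.113.0/24; substitute whatever you find in your logs):

```apache
# .htaccess - deny specific offending IPs, allow everyone else (Apache 2.4)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.45
    Require not ip 198.51.100.0/24
</RequireAll>
```

On older Apache 2.2 setups the equivalent uses `Order Allow,Deny` with `Deny from` lines instead, so check which version your host runs before copying this in.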
There are more complex solutions, but you need to be careful not to block legitimate visitors, or robots you actually want. More often than not, blocking by IP is effective in this sort of situation; you may need to keep adding addresses for a while, but you should get there in the end.