Search engines are among the most effective tools for managing, retrieving, and extracting important information from the massive volume of web data. These engines are scheduled to crawl the vast web environment and collect the countless pages stored in every corner of the web. Search engine providers continually seek to improve the relevance of results and to reduce response times for users, but both goals can be undermined by automated traffic generated by bots. This article first defines bots and the challenges of detecting them. It then presents a method, named 'boof', for detecting search robots. To achieve high accuracy in detecting anomalous robots, the boof method models user behavior with many different parameters. After ranking these parameters by their importance for distinguishing users, a decision tree is built that classifies users into four groups: humans, bots, legal bots, and unknown. The robots identified by the decision tree then enable a second stage of the detection system to recognize robots even at low request rates, by detecting the behavior pattern of the botnet they belong to. Evaluation of the proposed method on test data shows 97.7 percent accuracy in classifying users, an improvement of at least 9.9 percentage points over the methods previously examined in this area. This is a significant margin, affecting classification decisions for about 2,230 users per day.
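
To make the decision-tree stage concrete, the following minimal Python sketch trains a tree over per-session behavioral features and ranks them by importance, mirroring the parameter-prioritization step described above. It is an illustration only, not the authors' implementation: the feature names, the synthetic data, and the scikit-learn classifier are all assumptions introduced here.

    # Illustrative sketch only; feature names and data are hypothetical,
    # not the paper's actual parameters or log data.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical per-session behavioral features.
    FEATURES = ["requests_per_minute", "head_request_ratio",
                "robots_txt_hits", "empty_referrer_ratio"]
    # The four target groups named in the abstract.
    LABELS = ["human", "bot", "legal_bot", "unknown"]

    rng = np.random.default_rng(0)
    X = rng.random((400, len(FEATURES)))        # stand-in for real session features
    y = rng.integers(0, len(LABELS), size=400)  # stand-in for labeled sessions

    clf = DecisionTreeClassifier(max_depth=5, random_state=0)
    clf.fit(X, y)

    # Rank parameters by their contribution to the tree's splits.
    for name, score in sorted(zip(FEATURES, clf.feature_importances_),
                              key=lambda pair: -pair[1]):
        print(f"{name}: {score:.3f}")

    # Classify a new session into one of the four groups.
    session = [[0.9, 0.1, 1.0, 0.8]]
    print("predicted group:", LABELS[clf.predict(session)[0]])

In a real deployment, the feature matrix would be derived from web-server logs, and the sessions flagged as robots here would feed the second stage, which looks for shared botnet behavior patterns to catch low-request-rate bots.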