"""robotparser.py

Copyright (C) 2000 Bastian Kleineidam

You can choose between two licenses when using this package:
1) GNU GPLv2
2) PSF license for Python 2.2

The robots.txt Exclusion Protocol is implemented as specified in
http://www.robotstxt.org/norobots-rfc.txt
"""

import collections
import urllib.error
import urllib.parse
import urllib.request

__all__ = ["RobotFileParser"]

RequestRate = collections.namedtuple("RequestRate", "requests seconds")


class RobotFileParser:
    """This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.
    """

    def __init__(self, url=''):
        self.entries = []
        self.sitemaps = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.
        """
        return self.last_checked
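
# A brief usage sketch (not part of the dumped module itself): the parser
# can be fed robots.txt lines directly via parse(), which avoids the
# network fetch that read() performs. The user agent, rules, and URLs
# below are illustrative assumptions.
#
# from urllib.robotparser import RobotFileParser
#
# rp = RobotFileParser()
# rp.parse([
#     "User-agent: *",
#     "Disallow: /private/",
#     "Crawl-delay: 2",
# ])
# rp.can_fetch("MyBot", "https://example.com/private/page")  # disallowed
# rp.can_fetch("MyBot", "https://example.com/public/page")   # allowed
# rp.crawl_delay("MyBot")                                    # 2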