This article collects typical usage examples of the Java class crawlercommons.robots.BaseRobotsParser. If you are wondering what BaseRobotsParser is for and how it is used in practice, the selected examples below should help.
BaseRobotsParser belongs to the crawlercommons.robots package. Two code examples are shown below, ordered by popularity.
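For context before the examples: BaseRobotsParser is the abstract parser contract in crawler-commons, and SimpleRobotRulesParser is the stock implementation that ships with the library. The minimal sketch below is not taken from the examples on this page; it only illustrates how a fetched robots.txt body is typically turned into rules and queried. The URL, agent name, and robots.txt bytes are placeholders, and the parseContent signature shown is the one from older crawler-commons releases (newer versions take a Collection of agent names), so check it against your version.

import java.nio.charset.StandardCharsets;

import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.BaseRobotsParser;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotsParserSketch {
    public static void main(String[] args) {
        // Stock implementation of BaseRobotsParser shipped with crawler-commons.
        BaseRobotsParser parser = new SimpleRobotRulesParser();

        // Placeholder robots.txt content; in a real crawler this comes from an HTTP fetch.
        byte[] content = "User-agent: *\nDisallow: /private/".getBytes(StandardCharsets.UTF_8);

        // Older crawler-commons signature: (url, content, contentType, robotNames).
        BaseRobotRules rules = parser.parseContent(
                "http://www.example.com/robots.txt", content, "text/plain", "mycrawler");

        System.out.println(rules.isAllowed("http://www.example.com/private/page.html")); // false
        System.out.println(rules.isAllowed("http://www.example.com/index.html"));        // true
    }
}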
Example 1: getRobotRules
import crawlercommons.robots.BaseRobotsParser; // import the featured package/class
import java.net.MalformedURLException;
import java.net.URL;

/**
 * Fetches and parses the robots.txt rules that apply to the given URL.
 */
public static ExtendedRobotRules getRobotRules(String url) {
    ExtendedRobotRules robotRules = null;
    // A single fetcher thread is enough for one robots.txt request.
    BaseHttpFetcher fetcher = RobotUtils.createFetcher(new UserAgent("Mandrel", null, null), 1);
    BaseRobotsParser parser = new ExtendedRobotRulesParser();
    URL robotsTxtUrl = null;
    try {
        robotsTxtUrl = new URL(url);
    } catch (MalformedURLException e) {
        log.debug("Can not construct robots.txt url", e);
    }
    if (robotsTxtUrl != null) {
        robotRules = (ExtendedRobotRules) RobotUtils.getRobotRules(fetcher, parser, robotsTxtUrl);
    }
    return robotRules;
}
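A caller might use the returned rules as in the hypothetical snippet below. ExtendedRobotRules is a project-specific subclass of crawler-commons' BaseRobotRules, so only the inherited isAllowed and getCrawlDelay calls are shown, and the URLs are placeholders.

// Hypothetical caller; URLs are placeholders.
ExtendedRobotRules rules = getRobotRules("http://www.example.com/robots.txt");
if (rules != null && rules.isAllowed("http://www.example.com/some/page.html")) {
    long delayMillis = rules.getCrawlDelay(); // inherited from BaseRobotRules
    // schedule the fetch, honouring the crawl delay
}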
Example 2: RobotsManagerImpl
import crawlercommons.robots.BaseRobotsParser; // import the featured package/class

/**
 * Wires the robots.txt manager with the fetcher and parser it delegates to.
 */
public RobotsManagerImpl(BaseHttpFetcher fetcher, BaseRobotsParser parser) {
    this.fetcher = fetcher;
    this.parser = parser;
}
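Assuming RobotsManagerImpl simply stores these collaborators in fields, as the constructor suggests, it could be wired up with the standard crawler-commons components as sketched below. SimpleHttpFetcher comes from the older crawler-commons fetcher module, and the agent name and contact details are placeholders.

// Hypothetical wiring; agent name and contact details are placeholders.
UserAgent userAgent = new UserAgent("mycrawler", "ops@example.com", "http://www.example.com");
BaseHttpFetcher fetcher = new SimpleHttpFetcher(userAgent);   // older crawler-commons fetcher module
BaseRobotsParser parser = new SimpleRobotRulesParser();       // stock parser implementation
RobotsManagerImpl robotsManager = new RobotsManagerImpl(fetcher, parser);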