
java - Why am I getting javax.servlet.UnavailableException: CrawlServlet for my Filter?


OK, here is what I am trying to do.

I created a class called CrawlServlet in the server package.

package server;

import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;


public class CrawlServlet implements Filter {

    @Override
    public void destroy() {
        // Nothing to clean up.
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response,
            FilterChain chain) throws IOException, ServletException {
        HttpServletRequest httpRequest = (HttpServletRequest) request;
        String requestURI = httpRequest.getRequestURI();
        if ((requestURI != null) && (requestURI.contains("_escaped_fragment_"))) {
            // Crawler request: the _escaped_fragment_ URL will be handled here.
            System.out.println(requestURI);
        } else {
            try {
                // Not an _escaped_fragment_ URL, so pass it up the chain of servlet filters.
                chain.doFilter(request, response);
            } catch (ServletException e) {
                System.err.println("Servlet exception caught: " + e);
                e.printStackTrace();
            }
        }
    }

    @Override
    public void init(FilterConfig filterConfig) throws ServletException {
        // No initialization needed.
    }
}

In lib/web.xml, I have:

  <filter>
    <filter-name>CrawlServlet</filter-name>
    <filter-class>CrawlServlet</filter-class>
  </filter>

  <filter-mapping>
    <filter-name>CrawlServlet</filter-name>
    <url-pattern>/*</url-pattern>
  </filter-mapping>

After running it, I get this error:

Starting Jetty on port 8888
[WARN]
java.lang.ClassNotFoundException: CrawlServlet
at java.lang.ClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
.....
[WARN] FAILED CrawlServlet: javax.servlet.UnavailableException: CrawlServlet
javax.servlet.UnavailableException: CrawlServlet
....
[ERROR] 503 - GET /Myproject.html?gwt.codesvr=127.0.0.1:9997 (127.0.0.1) 1299 bytes
Request headers
Accept: text/html, application/xhtml+xml, */*
Accept-Language: en-AU
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Accept-Encoding: gzip, deflate
Host: 127.0.0.1:8888
If-Modified-Since: Wed, 16 Apr 2014 00:35:41 GMT
Connection: keep-alive
Response headers
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html;charset=ISO-8859-1
Content-Length: 1299

What is going wrong?

Can you help me fix this?

Best Answer

The class named in

<filter-class>CrawlServlet</filter-class>

must be fully qualified. Since CrawlServlet lives in the server package, you need to specify

<filter-class>server.CrawlServlet</filter-class>

or whichever package you actually put it in.
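
For completeness, here is a minimal sketch of the corrected web.xml registration, assuming the class really is compiled as server.CrawlServlet (adjust the package name if yours differs):

  <filter>
    <filter-name>CrawlServlet</filter-name>
    <filter-class>server.CrawlServlet</filter-class>
  </filter>

  <filter-mapping>
    <filter-name>CrawlServlet</filter-name>
    <url-pattern>/*</url-pattern>
  </filter-mapping>

Also make sure the Java source itself starts with package server; so that the compiled class ends up under WEB-INF/classes/server/CrawlServlet.class, where Jetty can find it.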

Regarding "java - Why am I getting javax.servlet.UnavailableException: CrawlServlet for my Filter?", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/23691243/
