<?xml version="1.0" encoding="UTF-8"?>
Copyright 2002-2004 The Apache Software Foundation or its licensors,
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
<!DOCTYPE document PUBLIC "-//APACHE//DTD Documentation V2.0//EN" "http://forrest.apache.org/dtd/document-v20.dtd">
<title>Automatic Recording for KiSS Hard Disk Recorders</title>
KiSS makes regular updates to their site that sometimes require adaptations
to the crawler. If it stops working, check here for the most recent version.
<section id="changelog">
<title>Changelog</title>
<title>24 August 2006</title>
<li>The crawler now uses the desktop login for crawling. It is also much more
efficient: it no longer needs to crawl the individual programs, because the channel page
includes descriptions of programs in JavaScript popups that the crawler can use.
This significantly reduces the load on the KiSS EPG site. In addition, the delay
between requests has been increased to reduce the load further.</li>
The crawler now crawls programs for tomorrow instead of for today.
The web based crawler is configured to run only between 7 PM and midnight. It used to run at
<title>13-20 August 2006</title>
There were several changes to the login procedure, requiring modifications to the crawler.
<li>The crawler now uses the 'Referer' header field correctly at login.</li>
<li>KiSS now uses hidden form fields in their login process, which are now also handled correctly by the
<section id="overview">
<title>Overview</title>
In 2005, <a href="site:links/kiss">KiSS</a> introduced the ability
to schedule recordings on KiSS hard disk recorders (such as the
DP-558) through a web site on the internet. When a new recording is
scheduled through the web site, the KiSS recorder finds out about
it by polling a server on the internet.
This is a really useful feature, since it basically allows programming
After using this feature for some time, I started noticing regular
patterns. Often you are looking for the same programs, or for certain
types of programs. So wouldn't it be nice to have a program
do this work for you, automatically recording programs and notifying you
of potentially interesting ones?
This is where the KiSS crawler comes in. It is a simple crawler that
logs on to the KiSS electronic programme guide web site and retrieves
programme information from there. Based on that information, it automatically
records programs for you or sends notifications about interesting ones.
In its current version, the crawler can be used in two ways:
<li><strong>standalone program</strong>: a standalone program run as a scheduled task.</li>
<li><strong>web application</strong>: a web application running on a Java
application server. With this type of use, the crawler also features an automatic retry
mechanism in case of failures, as well as a simple web interface.</li>
<title>Downloading</title>
At this moment, no formal releases have been made and only the latest
version can be downloaded.
The easiest way to start is with the
<a href="installs/crawler/kiss/kiss-crawler-bin.zip">standalone program binary version</a>
or the <a href="installs/crawler/kissweb/wamblee-crawler-kissweb.war">web
The latest source can be obtained from Subversion at the
URL <code>https://wamblee.org/svn/public/utils</code>. The Subversion
repository allows read-only access to anyone.
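For example, a read-only checkout (assuming the <code>svn</code> command-line client is installed; the target directory name is up to you) looks like this:

```shell
# Check out the latest crawler source (read-only access)
svn checkout https://wamblee.org/svn/public/utils utils
```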
The application was developed and tested on SuSE Linux 9.1 with the JBoss 4.0.2 application
server (only required for the web application). It requires a Java Virtual Machine
1.5 or greater to run.
<title>Configuring the crawler</title>
The crawler comes with three configuration files:
<li><code>crawler.xml</code>: the basic crawler configuration,
tailored to the KiSS electronic programme guide.</li>
<li><code>programs.xml</code>: a description of which
programs must be recorded and which programs are interesting.</li>
<li><code>org.wamblee.crawler.properties</code>: containing a configuration </li>
For the standalone program, all configuration files are in the <code>conf</code> directory.
For the web application, the properties file is located in the <code>WEB-INF/classes</code>
directory of the web application, and <code>crawler.xml</code> and <code>programs.xml</code>
are located outside the web application, at a location configured in the properties file.
<title>Crawler configuration <code>crawler.xml</code></title>
First of all, copy the <code>config.xml.example</code> file
to <code>config.xml</code>. After that, edit the first entry of
that file, replacing <code>user</code> and <code>passwd</code>
with your personal user id and password for the KiSS Electronic
<title>Program configuration</title>
Interesting TV shows are described using <code>program</code>
elements. Each <code>program</code> element contains
one or more <code>match</code> elements, each describing
a condition that the program must match.
Matching can be done on the following properties of a program:
<tr><th>Field name</th>
<th>Description</th></tr>
<td>Program name</td>
<td>Program description</td>
<td>Channel name</td>
<td>Keywords/classification of the program.</td>
The field to match is specified using the <code>field</code>
attribute of the <code>match</code> element. If no field name
is specified, the program name is matched. Matching is done
by converting the field value to lower case and then performing a
Perl-like regular expression match against the provided value. As a
result, the content of the match element should be specified in
lower case; otherwise the pattern will never match.
If multiple <code>match</code> elements are specified for a
given <code>program</code> element, then all matches must
apply for a program to be interesting.
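Putting this together, a <code>programs.xml</code> entry might look like the following sketch (the exact element layout is an assumption, and the channel name is purely illustrative; check the example file in the distribution):

```xml
<program>
  <!-- No field attribute: matches against the (lower-cased) program name -->
  <match>star.*trek</match>
  <!-- All match elements must apply, so this restricts matches to one channel -->
  <match field="channel">net 5</match>
</program>
```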
<th>Example of matching field values</th>
<td>the.*x.*files</td>
<td>"The X files", "The X-Files: the making of"</td>
<td>"Star Trek Voyager", "Star Trek: The next generation"</td>
It is possible that different programs cannot both be recorded
because they overlap. To deal with such conflicts, it is possible
to specify a priority using the <code>priority</code> element.
Higher values mean a higher priority.
If two programs have the same priority, then it is (more or less)
unspecified which of the two will be recorded, but at least one
program will be recorded. If no priority is specified, then the
Since it is not always desirable to try to record every
program that matches the criteria, it is also possible to
generate notifications for interesting programs without
recording them. This is done by specifying the
<code>action</code> element with the content <code>notify</code>.
By default, the <code>action</code> is <code>record</code>.
To make the mail reports more readable, it is possible to
also assign a category to a program for grouping interesting
programs. This can be done using the <code>category</code>
element. Note that if the <code>action</code> is
<code>notify</code>, then the <code>priority</code> element
<title>Notification configuration</title>
Edit the configuration file <code>org.wamblee.crawler.properties</code>.
The properties file is self-explanatory.
<title>Installing and running the crawler</title>
<title>Standalone application</title>
In the binary distribution, execute the
<code>run</code> script for your operating system
(<code>run.bat</code> for Windows,
<code>run.sh</code> for Unix).
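For example, on Unix (the directory name is an assumption; use whatever directory you unpacked the binary distribution into):

```shell
cd kiss-crawler    # the unpacked binary distribution
./run.sh           # on Windows: run.bat
```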
<title>Web application</title>
After deploying the web application, navigate to the
application in your browser (e.g.
<code>http://localhost:8080/wamblee-crawler-kissweb</code>).
The screen should show an overview of the last time it ran (if
it has run before), as well as a button to run the crawler immediately.
Also, the result of the last run can be viewed.
The crawler will run automatically every morning at 5 AM local time,
and will retry at 1-hour intervals if it fails to retrieve
programme information.
<title>Source distribution</title>
With the source code, build everything with
<code>ant dist-lite</code>, then locate the binary
distribution in <code>lib/wamblee/crawler/kiss/kiss-crawler-bin.zip</code>.
Then proceed as for the binary distribution.
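The build steps look like this (assuming <code>ant</code> is installed and you are in the root of the source tree):

```shell
ant dist-lite
# the binary distribution is now at:
# lib/wamblee/crawler/kiss/kiss-crawler-bin.zip
```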
<title>General usage</title>
When the crawler runs, it
retrieves the programs for tomorrow. As a result, it is advisable
to run the program early in the day as a scheduled
task (e.g. using cron on Unix). For the web application this is
preconfigured at 5 AM.
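For the standalone program, a crontab entry along these lines would run the crawler every morning at 5 AM (the installation path is an assumption; point it at your own installation):

```shell
# min hour day month weekday  command
0     5    *   *     *        cd /opt/kiss-crawler && ./run.sh
```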
If you deploy the web application today, it will run automatically
on the next (!) day. This holds even if you deploy the application
before the normal scheduled time.
Modifying the program to investigate a different day's
programmes is easy as well, but not yet implemented.
<section id="examples">
<title>Examples</title>
The best example is in the distribution itself: it is my personal
<code>programs.xml</code> file.
<title>Contributing</title>
You are always welcome to contribute. If you find a problem, just
tell me about it, and if you have ideas, I am always interested to
If you are a programmer and have a fix for a bug, just send me a
patch, and if you are fanatic enough and have ideas, I can also
give you write access to the repository.