Files/BatchStats

How to create statistics in batch mode

The code for creating statistics in batch mode is available here: http://eprints.rclis.org/fixsoft/stats.tar.gz

There is no proper install manual yet, so you need to read the code in order to use the scripts.

A general introduction: the package uses a specific Apache log format. First, you need to set up logging with these directives:

 LogFormat "%h \"%l\" \"%u\" %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" extra
 CustomLog /your_path/logs/access_log extra
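
For reference, a line written with this format looks roughly like the following. This is a purely hypothetical example: the host, request, referer and user agent are made up, it only illustrates the field order produced by the LogFormat above.

 192.0.2.10 "-" "-" [19/Jan/2007:16:12:00 +0100] "GET /archive/00000123/01/paper.pdf HTTP/1.1" 200 48230 "http://www.example.org/search?q=eprints" "Mozilla/5.0 (compatible; ExampleBrowser/1.0)"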

Second, you need to create the database and tables used as the data warehouse. Use the script 'schema.eprintstats-ELIS.sql'. The database must be named eprintstats.
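
A minimal sketch of this step, assuming a local MySQL server and a user with privileges to create databases (the user name and the working directory are placeholders, not part of the package):

 mysqladmin -u root -p create eprintstats
 mysql -u root -p eprintstats < schema.eprintstats-ELIS.sql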

Now use the scripts 'filtra_rob2.pl' and 'input_log_def.pl' to insert data from the Apache log into MySQL. To purge the Apache log of robot activity you need a list of robot user agents. You can use the list available in the file 'lista_agenti.dat'. To create a similar list you can use the script 'list_robots.pl'.
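
There is no documented command line for these scripts, so the following is only a hypothetical sequence; the actual arguments (log file, agent list, MySQL credentials) must be checked in the scripts themselves.

 # purge robot requests from the raw log (arguments are assumptions)
 perl filtra_rob2.pl /your_path/logs/access_log lista_agenti.dat > access_log.clean
 # load the cleaned log into the eprintstats database (arguments are assumptions)
 perl input_log_def.pl access_log.clean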

The script 'update_stat.sh' contains the operations to run every day to update the data warehouse.
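
Since the update is meant to run daily, a cron entry is a natural way to schedule it; the installation path and the time below are placeholders, not part of the package:

 # run the data warehouse update every night at 03:00 (path is an assumption)
 0 3 * * * /your_path/stats/update_stat.sh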

To create the web pages use the scripts 'const_stat.pm', 'crea_file.pl', 'crea_cumulo.pl', 'crea_country.pl' and the subdirectories 'bars' and 'flags1814'.
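
Again there is no documented invocation, so this is only a sketch under the assumption that the scripts are run from the directory containing 'const_stat.pm', 'bars' and 'flags1814' and take no mandatory arguments; check the code for the real options and for what each script actually generates.

 cd /your_path/stats     # directory holding const_stat.pm, bars/ and flags1814/ (path is an assumption)
 perl crea_file.pl       # per-eprint pages (description inferred from the name, an assumption)
 perl crea_cumulo.pl     # cumulative pages (description inferred from the name, an assumption)
 perl crea_country.pl    # per-country pages (description inferred from the name, an assumption)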