We are looking for a Python and Drupal expert to implement the following project:
1) On the first site we need to create a platform to manage crawlers (not to create them).
Each crawler will be created separately and saved into this platform. The creation of each crawler for a specific site will be quoted separately. For each crawler we need the source code, so we can modify it in the future if necessary.
The features of the platform are (including modifying the source code of a crawler):
Start each crawler whenever we want, individually or all at once
Schedule the start at a specific time and also periodically
Report the extracted data in different formats: Excel file, CSV, MySQL …
Send the report data by email or by connecting to a server or FTP server
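To illustrate the kind of behaviour described above, here is a minimal sketch in Python of a platform loop that starts crawlers individually or all at once, re-runs them periodically with the standard-library scheduler, and exports each run to CSV. The `CRAWLERS` registry, the crawler names, and the report filenames are all hypothetical placeholders, not part of the actual specification:

```python
import csv
import sched
import time

# Hypothetical registry: crawler name -> callable returning extracted rows.
# In the real platform each entry would be a separately developed crawler.
CRAWLERS = {
    "example_site": lambda: [{"url": "http://example.com", "title": "Example"}],
}

def run_crawler(name):
    """Run one crawler and export its rows to a CSV report file."""
    rows = CRAWLERS[name]()
    path = f"{name}_report.csv"
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    return path

def run_all():
    """Start every registered crawler, one after another."""
    return [run_crawler(name) for name in CRAWLERS]

def schedule_periodic(scheduler, interval_seconds):
    """Run all crawlers now, then re-arm the job every interval_seconds."""
    run_all()
    scheduler.enter(interval_seconds, 1, schedule_periodic,
                    (scheduler, interval_seconds))

# Example: scheduler = sched.scheduler(time.time, time.sleep)
#          schedule_periodic(scheduler, 3600); scheduler.run()
```

The generated CSV file could then be delivered with the standard `smtplib` (email) or `ftplib` (FTP) modules; Excel or MySQL output would use additional libraries.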
THIS PLATFORM MUST BE BUILT SO THAT IT CAN ALSO BE USED FOR FURTHER CRAWLERS CREATED LATER
THIS IS A BACK-OFFICE PLATFORM, SO ACCESS TO IT MUST BE PROTECTED BY USERNAME AND PASSWORD
THE DESIGN IS NOT IMPORTANT FOR THIS SITE, BUT IT MUST BE USABLE AND ORGANIZED SO THAT OTHER CRAWLERS CAN BE ADDED IN THE FUTURE
2) On the other site we need to create a platform where users always log in with a username and password to display the data. You could use Drupal to create it.
Here the design is important. Eventually we can give you a PSD of the site.
On this platform we need to store the data taken from the crawlers. It would be appropriate for this data to be stored automatically in the database of this site. We still have to decide when this should happen (so we can run periodic tests on the data without compromising the database).
Users should be able to decide which data fields to display by creating a custom report: choosing the date period, the fields, etc.
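The custom-report logic described above could be sketched as a simple filter: keep only the records inside the chosen date period and project out only the chosen fields. The record layout (`date`, `visits`, `source`) and the `date_field` parameter are illustrative assumptions; the real fields will be supplied later, as the posting says:

```python
from datetime import date

def custom_report(records, fields, start, end, date_field="date"):
    """Return only the chosen fields for records whose date
    falls within [start, end] inclusive.

    records: list of dicts as stored in the reporting database
    fields:  field names the user selected for this report
    """
    return [
        {f: rec[f] for f in fields}
        for rec in records
        if start <= rec[date_field] <= end
    ]

# Example (hypothetical data):
# rows = [{"date": date(2023, 1, 1), "visits": 10, "source": "a"}]
# custom_report(rows, ["visits"], date(2023, 1, 1), date(2023, 1, 31))
```

In a Drupal build, this same select-fields-and-date-range behaviour is typically what the Views module provides out of the box.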
Besides that, the platform should do some analysis of the stored data, but nothing complicated. Some charts too.
To be clear, a sort of Google Analytics: a platform where data is stored, displayed, and managed.
Specific features (and fields) will be supplied later.