SkillHunter is all about helping you and your mates identify the most in-demand skills in the job market so you can find a new position as soon as possible. Cut the bullshit and prepare in the most efficient way, because there's no need to learn everything!
What if you've been asking yourself the wrong question: "What do I need to learn?" A better one might be: "What don't I need to learn?" The latter saves you a ton of time by cutting out unnecessary work, and time is at its most valuable at the very beginning, when everything is unknown – that's where SkillHunter really shines.
Download this project and run it with Docker Compose on your machine:
git clone https://github.com/spyker77/skillhunter.git
cd skillhunter
Update the environment variables inside docker-compose.yml and run the following bash command inside the downloaded project's folder – this will build the image (if it doesn't exist yet) and create and start the containers in detached mode.
- Due to forced HTTPS in production, it might be a good idea to start with ENVIRONMENT=development in the .env file – this will help you avoid SSL-related errors (see the sample .env sketch after the command below).
docker compose up -d
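For reference, here is a minimal sketch of what that .env file might contain. ENVIRONMENT comes from the note above and DB_HOST from the test command further down; anything else your setup needs is up to you.

# .env – a minimal sketch for local development
ENVIRONMENT=development
DB_HOST=db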
- On the first run you need to apply migrations to the fresh database:
docker compose exec web python manage.py migrate
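If you want to confirm that everything was applied, Django's built-in showmigrations command lists each migration with its status:

docker compose exec web python manage.py showmigrations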
- Load the job titles to parse and skills to identify:
docker compose exec web python manage.py loaddata jobs.json skills.json
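As a quick sanity check that the fixtures landed, you can count the loaded rows from a Django shell one-liner. Note that the jobs.Job and skills.Skill model paths here are an assumption inferred from the fixture file names, not something this README confirms:

# model paths below are assumptions inferred from the fixture names
docker compose exec web python manage.py shell -c "from jobs.models import Job; from skills.models import Skill; print(Job.objects.count(), Skill.objects.count())"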
- [Optional] Run scrapers manually to collect data about available vacancies:
docker compose exec worker python manage.py scrape_hh
docker compose exec worker python manage.py scrape_indeed
docker compose exec worker python manage.py scrape_sh
Or rely on Celery beat to run this and other tasks periodically – see the CELERY_BEAT_SCHEDULE setting.
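To see what is actually scheduled without digging through the settings module, you can print that setting from a Django shell one-liner:

docker compose exec web python manage.py shell -c "from django.conf import settings; print(settings.CELERY_BEAT_SCHEDULE)"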
Tada 🎉
By now you should be up and running. Try reaching http://localhost in your browser.
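If the browser shows nothing, a quick check from the host helps confirm whether the web container is answering at all:

# expect a 200, or a redirect if HTTPS is forced
curl -I http://localhost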
To prepopulate the database, you can use the test data:
docker compose exec web python manage.py loaddata scrapers_vacancy.json scrapers_vacancy_part_1.json scrapers_vacancy_part_2.json scrapers_vacancy_part_3.json
To run the tests:
docker compose exec -e DB_HOST=db web pytest -n auto --cov="."
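During development it is often faster to run only a subset – the usual pytest flags work here too, for example -k to filter by name and -x to stop on the first failure (the "scraper" filter below is just an illustration):

docker compose exec -e DB_HOST=db web pytest -k "scraper" -x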
Note
For local development you may need to install pg_config for psycopg (e.g. using brew install postgresql), and Node.js in case you want to update the packages for a production build of the CSS.
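As a rough sketch, on macOS that boils down to the following (assuming Homebrew is available on your machine):

brew install postgresql   # provides pg_config for psycopg
brew install node         # only needed for the production CSS build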
Pull requests are really welcome. For major changes, please open an issue first to discuss what you would like to change.
Also, make sure to update tests as appropriate 🙏
This project is licensed under the terms of the MIT license.