AutoApply AI is an intelligent system designed to streamline your job search. It automates finding relevant job postings, crafts personalized resumes and cover letters with AI, and assists with application submissions, helping you land your dream job faster.
## Features

- Multi-source Job Search: Search for jobs across LinkedIn, Indeed, Glassdoor, and other platforms.
- Intelligent Job Filtering: Filter jobs by your skills, experience, location, and custom keywords.
- AI-Powered Document Generation: Create tailored resumes and cover letters for each job application using advanced AI models (e.g., Llama 4 Maverick, Llama 3).
- Automated Application Submission: Assists with, or fully automates, submitting applications through supported platforms such as LinkedIn.
- Job Match Scoring: Calculates compatibility scores between your profile and job requirements to prioritize applications.
- Application Tracking: Keep track of all your job applications, their statuses, and follow-up actions in one place.
- Resume Optimization: Analyzes your existing resume against job descriptions and suggests improvements.
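The job-match scoring described above can be pictured with a small sketch. This is a hypothetical illustration (not the project's actual scoring code) that rates the overlap between a candidate's skills and a job's requirements using Jaccard similarity:

```python
def match_score(candidate_skills, job_requirements):
    """Return a 0-100 compatibility score based on skill overlap (Jaccard similarity)."""
    candidate = {s.strip().lower() for s in candidate_skills}
    required = {s.strip().lower() for s in job_requirements}
    if not required:
        return 0.0
    overlap = candidate & required
    return round(100 * len(overlap) / len(candidate | required), 1)

# 2 shared skills out of 4 distinct skills overall -> 50.0
score = match_score(["Python", "SQL", "AI"], ["python", "ai", "Docker"])
```

A real scorer would also weigh experience, location, and keyword matches, but the prioritization idea is the same: rank jobs by score and apply to the best matches first.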
## Demo / Screenshots

*(Placeholder: Consider adding a GIF or screenshots showcasing AutoApply AI in action, for example a screen recording of the job search, resume generation, or application submission process.)*
## Table of Contents

- AutoApply AI: Smart Job Application Assistant
- Features
- Demo / Screenshots
- Technology Stack
- Project Structure
- Getting Started
- Usage
- Customization
- Contributing
- Safety and Ethics
- License
- Acknowledgments
## Technology Stack

- Core Engine: Python
- Browser Automation: `browser-use` for automating web interactions.
- Web Scraping: `Crawl4AI` for intelligent data extraction from job listings.
- LinkedIn Integration: Custom integration (potentially using `linkedin-mcp-config.py` logic).
- AI-Powered Document Generation: Configurable LLMs (e.g., Llama 4 Maverick, Llama 3) via APIs (GitHub, Groq, OpenRouter) or a local `llama_cpp` setup.
- Document Processing: `python-docx` and `docxtpl` for MS Word documents.
- Database: SQLAlchemy for application tracking.
- Database Migrations: Alembic.
- Configuration Management: Pydantic, `.env` files, and YAML (e.g., `config/config.yaml`).
- Dependencies: Managed via `requirements.txt` (and potentially a broader workspace-level file).
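For a concrete picture of the application-tracking layer, the sketch below uses stdlib `sqlite3` to mimic the kind of table the SQLAlchemy models manage. The table and column names are illustrative assumptions, not the project's actual schema (see `src/database.py` and the Alembic migrations for that):

```python
import sqlite3

# Hypothetical schema resembling what the application tracker might store
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE job_applications (
           id INTEGER PRIMARY KEY,
           company TEXT NOT NULL,
           title TEXT NOT NULL,
           status TEXT DEFAULT 'draft',  -- e.g., draft, submitted, interview
           applied_at TEXT
       )"""
)
conn.execute(
    "INSERT INTO job_applications (company, title, status) VALUES (?, ?, ?)",
    ("Tech Corp", "Software Engineer", "submitted"),
)
rows = conn.execute("SELECT company, status FROM job_applications").fetchall()
```

In the real project, SQLAlchemy models replace the raw SQL and Alembic migrations evolve the schema over time.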
## Project Structure

```
job_application_automation/
├── .env.example            # Example environment variables
├── README.md               # This file
├── requirements.txt        # Python dependencies for this application
├── alembic.ini             # Alembic migration configuration
├── config/                 # Configuration settings
│   ├── __init__.py
│   ├── browser_config.py   # Browser automation settings
│   ├── llama_config.py     # LLM settings
│   └── ...                 # (other config files)
├── data/                   # Data storage (logs, generated docs, DB, etc.)
│   ├── candidate_profile.json  # Example candidate profile
│   ├── job_applications.db     # SQLite database for tracking
│   └── generated_cover_letters/
├── src/                    # Source code
│   ├── __init__.py
│   ├── main.py             # Main application entry point
│   ├── smart_apply.py      # Core application logic script
│   ├── browser_automation.py   # Browser interaction logic
│   ├── web_scraping.py         # Job scraping utilities
│   ├── linkedin_integration.py # LinkedIn-specific functions
│   ├── resume_cover_letter_generator.py  # AI document generation
│   ├── resume_optimizer.py     # Resume analysis and improvement
│   ├── application_tracker.py  # Tracks job applications
│   ├── database.py             # Database models and session management
│   └── ...                     # (other modules)
├── templates/              # Document templates (resume, cover letter)
│   ├── resume_template.docx
│   └── cover_letter_template.docx
├── tests/                  # Test cases
└── ...                     # (other project files)
```
## Getting Started

### Prerequisites

- Python 3.8 or higher
- pip (Python package installer)
- Git (for cloning the repository)
- Access to an LLM:
  - An API key for a service like Groq or OpenRouter, OR
  - A GitHub Personal Access Token for GitHub Models (such as Llama 4 Maverick), OR
  - A local GGUF model file (e.g., Llama 3, Llama 2) with `llama-cpp-python` installed.
- (Optional) A LinkedIn account for features involving direct LinkedIn interaction.
### Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/your-username/your-repo-name.git
   cd your-repo-name/job_application_automation
   ```

   (Replace `your-username/your-repo-name` with your actual repository URL if applicable; otherwise adjust the `cd` path if already cloned.)
2. Create and activate a virtual environment (recommended):

   ```bash
   python -m venv venv
   # On Windows
   venv\Scripts\activate
   # On macOS/Linux
   source venv/bin/activate
   ```
3. Install dependencies:

   ```bash
   # From the project root
   pip install -r job_application_automation/requirements.txt
   ```

   (If `llama-cpp-python` is needed for local models and is not listed in `requirements.txt`, you may need to install it separately, potentially with GPU support flags, e.g., `CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python`.)
4. Set up environment variables: From the `job_application_automation` directory, copy `.env.example` to `.env`:

   ```bash
   cp .env.example .env      # macOS/Linux
   # copy .env.example .env  # Windows
   ```

   Edit the `.env` file with your specific configuration. Key variables to set:

   ```bash
   # --- LLM Configuration ---
   LLAMA_USE_API=True          # Set to False for a local model
   LLAMA_API_PROVIDER="github" # Options: "github", "groq", "openrouter", "local"

   # If LLAMA_API_PROVIDER="github"
   GITHUB_TOKEN="your_github_personal_access_token_with_models_scope"
   LLAMA_API_MODEL="meta/Llama-4-Maverick-17B-128E-Instruct-FP8"  # Or another compatible GitHub model

   # If LLAMA_API_PROVIDER="groq"
   # GROQ_API_KEY="your_groq_api_key"
   # LLAMA_API_MODEL="llama3-8b-8192"  # Or another Groq model

   # If LLAMA_API_PROVIDER="openrouter"
   # OPENROUTER_API_KEY="your_openrouter_api_key"
   # LLAMA_API_MODEL="meta-llama/llama-3-8b-instruct"  # Or another OpenRouter model

   # If LLAMA_USE_API=False (for a local model)
   # LLAMA_MODEL_PATH="../models/your_local_model_name.gguf"  # Adjust to your model file
   # LLAMA_USE_GPU=True   # Set to False if no compatible GPU
   # LLAMA_GPU_LAYERS=32  # Adjust for your GPU and model (0 for CPU only)

   # --- LinkedIn Credentials (Optional) ---
   # LINKEDIN_EMAIL="your_linkedin_email"
   # LINKEDIN_PASSWORD="your_linkedin_password"

   # --- Other configuration ---
   # Review the config/ files for more settings (browser paths, etc.)
   ```
5. Set up the database: The application uses SQLAlchemy and Alembic for database management (tracking job applications). Initialize or upgrade the database schema:

   ```bash
   cd job_application_automation
   alembic upgrade head
   ```

   Run this command from the `job_application_automation` directory, where `alembic.ini` is located. It will create or update the `job_applications.db` file in `job_application_automation/data/`.
6. Create your candidate profile: Create or update your candidate profile in `data/candidate_profile.json`. This file is used to personalize resumes and cover letters. An example structure might be:

   ```json
   {
     "full_name": "Your Name",
     "email": "your.email@example.com",
     "phone": "123-456-7890",
     "linkedin_url": "https://linkedin.com/in/yourprofile",
     "github_url": "https://github.com/yourusername",
     "portfolio_url": "https://yourportfolio.com",
     "summary": "A brief professional summary...",
     "skills": ["Python", "AI", "Web Scraping", "Project Management"],
     "experience": [
       {
         "title": "Software Engineer",
         "company": "Tech Corp",
         "dates": "Jan 2020 - Present",
         "description": "Developed amazing things..."
       }
     ],
     "education": [
       {
         "degree": "B.S. in Computer Science",
         "university": "State University",
         "year": "2019"
       }
     ]
   }
   ```
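Since document generation depends on this profile, a quick sanity check before running can save debugging time. A hypothetical stdlib-only validation sketch (the required-key list is an assumption based on the example structure, not an official schema):

```python
import json

# Keys assumed essential for resume/cover-letter generation
REQUIRED_KEYS = {"full_name", "email", "summary", "skills", "experience", "education"}

def validate_profile(profile: dict) -> list:
    """Return a sorted list of missing required keys (empty means the profile looks complete)."""
    return sorted(REQUIRED_KEYS - profile.keys())

profile = json.loads('{"full_name": "Your Name", "email": "you@example.com", "skills": []}')
missing = validate_profile(profile)  # ["education", "experience", "summary"]
```

In practice you would load `data/candidate_profile.json` with `json.load(open(path))` and abort with a clear message if `missing` is non-empty.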
## Usage

The primary way to run the application is likely through `src/main.py` or `src/smart_apply.py`.

1. Run the application:

   ```bash
   cd job_application_automation
   python src/cli.py search --keywords "python,ai" --location "Remote"
   ```

   (Or `python src/main.py` for the end-to-end flow.)
2. Interactive mode / CLI: The application may offer an interactive command-line interface to:

   - Search for jobs.
   - Select jobs to apply for.
   - Review and approve generated documents.
   - Track application status.

   (Refer to the script's help messages or internal documentation for specific commands: `python src/smart_apply.py --help`.)
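A `search` subcommand like the one shown above could be wired up with `argparse`. This is a hypothetical sketch of the CLI shape, not the project's actual `src/cli.py`:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Build a CLI with a 'search' subcommand taking --keywords and --location."""
    parser = argparse.ArgumentParser(prog="autoapply")
    sub = parser.add_subparsers(dest="command", required=True)

    search = sub.add_parser("search", help="Search job boards")
    search.add_argument("--keywords", required=True, help="Comma-separated keywords")
    search.add_argument("--location", default="Remote")
    return parser

# Simulate: python src/cli.py search --keywords "python,ai" --location "Remote"
args = build_parser().parse_args(["search", "--keywords", "python,ai", "--location", "Remote"])
keywords = [k.strip() for k in args.keywords.split(",")]
```

New subcommands (e.g., for tracking or document review) would each get their own `add_parser` call with their own flags.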
## Customization

You can customize the base MS Word templates used for document generation. These are located in the `templates/` directory:

- `templates/resume_template.docx`: Base template for resumes.
- `templates/cover_letter_template.docx`: Base template for cover letters.

These templates use Jinja2 syntax (e.g., `{{ variable_name }}`) for placeholders that the AI will populate. You can modify their structure and formatting, and add or remove placeholders to better suit your personal style. Different styles and templates can also be managed programmatically in the document-generation code.
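The real rendering is done by `docxtpl` inside Word documents, but the placeholder mechanics are easy to see in isolation. A toy stdlib sketch of `{{ variable_name }}` substitution (an illustration only, not the docxtpl or Jinja2 implementation):

```python
import re

def render_placeholders(template: str, context: dict) -> str:
    """Replace {{ name }} placeholders with values from context (toy version of Jinja2)."""
    def substitute(match):
        key = match.group(1)
        # Leave unknown placeholders intact so missing data is visible in the output
        return str(context.get(key, match.group(0)))
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

letter = render_placeholders(
    "Dear {{ hiring_manager }}, I am excited to apply for {{ job_title }}.",
    {"hiring_manager": "Ms. Smith", "job_title": "Software Engineer"},
)
```

Any placeholder you add to the `.docx` templates simply needs a matching key in the context the generator builds from your profile and the job posting.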
## Contributing

Contributions are highly welcome! Whether it's reporting a bug, proposing a new feature, improving documentation, or writing code, your help is appreciated.

1. Fork the repository.
2. Create a new branch for your feature or bug fix: `git checkout -b feature/your-amazing-feature`
3. Make your changes and commit them with clear, descriptive messages.
4. Ensure your code passes any existing tests and, if adding new features, include new tests.
5. Push your branch to your fork: `git push origin feature/your-amazing-feature`
6. Open a Pull Request against the `main` (or `develop`) branch of the original repository, with a detailed description of your changes.
## Safety and Ethics

AutoApply AI is a powerful tool. Please use it responsibly and ethically:
- Review AI-Generated Content: Always carefully review resumes, cover letters, and any application answers generated by the AI before submission. Ensure accuracy, authenticity, and that it truly represents you.
- Respect Rate Limits: Be mindful of the frequency of job searches and application submissions to avoid overloading job portals or APIs. Configure delays if necessary.
- Honest Representation: Do not use this tool to misrepresent your skills, experience, or qualifications. The AI is an assistant, not a replacement for your genuine abilities.
- Adhere to Terms of Service: Respect the terms of service of any job boards (LinkedIn, Indeed, etc.) or platforms interacted with by this tool. Automation may be against the ToS of some platforms.
- Privacy: Be cautious about the personal information you provide (credentials, profile data) and how it's handled by the system and any third-party APIs. Store sensitive data securely.
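One simple way to respect rate limits, as suggested above, is a jittered delay between requests. A hypothetical sketch (the delay values are illustrative, not project defaults):

```python
import random
import time

def polite_delay(base_seconds: float = 5.0, jitter: float = 2.0) -> float:
    """Sleep for base_seconds plus random jitter to avoid hammering job portals."""
    delay = base_seconds + random.uniform(0, jitter)
    time.sleep(delay)
    return delay

# polite_delay()  # sleeps roughly 5-7 seconds between submissions
```

Randomized delays spread requests out more naturally than a fixed interval; pick values that stay well within each platform's published limits.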
## License

This project is licensed under the MIT License. See the LICENSE file in the repository for full details.

*(If no LICENSE file exists, consider adding one; MIT is a common choice for open-source projects.)*
## Acknowledgments

This project stands on the shoulders of giants and leverages many fantastic open-source tools and communities:
- browser-use for robust browser automation.
- Crawl4AI for intelligent web scraping.
- LLM Providers & Libraries (e.g., Llama CPP for local models, Hugging Face, OpenAI, Groq, OpenRouter).
- python-docx-template (docxtpl) for Word document template rendering.
- The Python community and the developers of numerous other libraries used.
This README was enhanced with the help of Trae AI.