u/DaKheera47

I built an open-source job search tracker and users have now logged 7 offers through it

I’ve been building JobOps, an open-source tool for running a job search without everything ending up scattered across tabs, spreadsheets, downloads folders, and inboxes.

It started from my own frustration with job searching. A long search gets messy very quickly. You’re checking multiple job boards, using different search terms, saving roles, tailoring CVs, tracking stages, following up, and trying to remember why you even cared about a role three weeks later.

The last 30 days of usage were interesting.

Users logged around 1,500 job applications.

Around 270 got some kind of response, so roughly 18%, which is well above the industry average.

I’m counting a response as anything that is not a ghost.

There were also 7 recorded offers.

JobOps is not an auto-apply tool. I don’t like those. The human still chooses the roles and applies manually. The tool is more like a system for the search: collect roles, score fit, tailor applications, track stages, and stop the whole thing from becoming tab chaos.

Still early and still rough in places, but it’s cool seeing open-source software actually help people move through this market.

u/DaKheera47 — 4 days ago

Who I am

Okay, so I am a CS student in the UK, and I came here from Pakistan in 2022.

Since then, I did a placement internship at Autodesk, spending a year on frontend development there. I now have a return offer with visa sponsorship at the end of my degree, which, from what I understand, is an extremely difficult thing to land in this job market.

The problem I ran into

When I was job hunting for the second time, for my graduate position, I realised I had a bit of a problem on my hands.

All of the visa-sponsorship-specific job boards are paid, and jobs on generic platforms like LinkedIn and Indeed do not always publish sponsorship data.

So you end up opening a role, reading through it, thinking it might be relevant, and then somewhere near the bottom it says they do not sponsor.

On top of this, with the rise of AI, companies were posting far more jobs, but most of them were not very good, and that is what made AI auto-application tools so popular.

Why I disagree with auto-apply

I intensely disagree with these auto-apply tools because they apply blindly to any position with no thought whatsoever, and you have very little input into what you are actually submitting.

So what version of you shows up in the recruiter’s hands? You have no control over that.

That feels wrong to me.

The manual workflow I was stuck in

This is where JobOps comes into play.

I realised that I was doing a lot of manual copy-pasting.

Copying the job description from LinkedIn, pasting it into ChatGPT, having it write a custom resume summary for me, pasting that summary into Reactive Resume (my resume generator of choice), exporting the resume, and then filling in the application.

And then doing it again.

And again.

And again.

The search problem

Along with this, a big problem was finding jobs in the first place.

Just LinkedIn is not enough and just Indeed is not enough.

I would have to look at all of these boards with multiple search terms every day. Software engineer, software developer, frontend engineer, frontend developer, React developer, TypeScript developer, full stack developer, web developer.

If you think about it, that is multiple search terms across multiple job boards, every single day.

For example:

3 locations × 8 search terms × 10 sources = 240 searches

Before you have even started applying.

What JobOps does

So the first place JobOps comes in is the search.

You can specify a bunch of search terms and a bunch of sources to search, and the app goes out, scrapes them, and puts everything into one dashboard.
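If it helps to picture it, here is roughly the shape of that fan-out in Python. The function and list names are made up for illustration, not JobOps' actual internals:

```python
# Illustrative sketch only; the scrape function and board names are hypothetical.
from itertools import product

def scrape(source: str, term: str, location: str) -> list[dict]:
    """Stand-in scraper; a real one would fetch and parse the board here."""
    return []

terms = ["frontend developer", "react developer", "typescript developer"]
locations = ["London", "Remote UK"]
sources = ["linkedin", "indeed"]

# Fan out over every combination, then merge the results into one list.
jobs: list[dict] = []
for source, term, location in product(sources, terms, locations):
    jobs.extend(scrape(source, term, location))

# De-duplicate by URL so each role shows up once in the dashboard.
unique_jobs = list({job["url"]: job for job in jobs}.values())
```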

With this, you have cast a wide net and collected a lot of jobs.

But you still need to surface the best ones based on your experience.

So I use AI for that part, either a local model or a cloud API, and JobOps scores the jobs against your profile so the best ones rise to the top.
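The scoring step is conceptually something like the sketch below. This is not JobOps' actual code; it assumes an OpenAI-compatible endpoint, which many local runners like Ollama also expose, and the prompt is just an example:

```python
# Conceptual sketch of LLM fit scoring; the prompt and model name are examples.
from openai import OpenAI

client = OpenAI()  # point base_url at a local server to keep it fully offline

def score_job(profile: str, job_description: str) -> int:
    """Ask the model for a 0-100 fit score for a single job."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # or whatever local model you run
        messages=[
            {"role": "system",
             "content": "Score how well this candidate fits the job from 0 to 100. "
                        "Reply with the number only."},
            {"role": "user",
             "content": f"Candidate profile:\n{profile}\n\nJob:\n{job_description}"},
        ],
    )
    return int(response.choices[0].message.content.strip())

# Sort so the best-fitting jobs rise to the top of the dashboard.
# ranked = sorted(jobs, key=lambda j: score_job(profile, j["description"]), reverse=True)
```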

The result is that one click nets you the equivalent of those 240 searches, or even more depending on how many sources, search terms, and locations you use, which vastly reduces the time spent just searching for jobs.

How CV tailoring works

Once these jobs are found, however, you still need to tailor a CV for them.

And I do not change the entire CV like many other apps claim to do.

I only change the ATS part and the positioning part.

So I change the summary, the keywords at the end, the title under my name, and the short summary right under that, especially the part that shows why I am relevant to the role.

Not a fake version of me.

Just a better-positioned version of the same experience.
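In code terms, the tailoring step boils down to something like this. The resume structure here is invented for illustration; Reactive Resume's actual JSON schema is different:

```python
# Hypothetical resume structure; the point is what changes and what does not.
def tailor(resume: dict, tailored: dict) -> dict:
    """Swap only the positioning fields; leave the actual experience untouched."""
    out = dict(resume)
    out["title"] = tailored["title"]        # the title under your name
    out["summary"] = tailored["summary"]    # the short pitch for this role
    out["keywords"] = tailored["keywords"]  # the ATS keyword block at the end
    # Work history, education, and projects stay exactly as they are.
    return out
```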

The main idea

This is the main crux of the app.

Search, tailor, and apply.

There is a bunch of extra stuff as well that you can find in the documentation, but that is the core idea.

Where it is now

The app is built by me, it is open source, it is self-hostable, and it is now used by over 200 people every day.

There is also a community of around 3,000 people who have starred it on GitHub, and it is used all over the world, including in countries I had never even heard of.

A lot of the audience has been from the US, which is nice, but I think Pakistani developers will understand the problem very quickly because our job search is rarely one clean path.

For Pakistanis, it is local jobs, remote jobs, Gulf jobs, UK sponsorship, Europe, contractor work, and whatever else might work.

GitHub: https://github.com/dakheera47/job-ops/
Hosted version: https://jobops.app/

u/DaKheera47 — 8 days ago

Hello,

While building scrapers for JobOps, I realised there is a lot of repetitive work when I am first scoping out a website to see what kind of protections it has. After building the last few, I realised I could really optimise this by automating the steps.

So I made a tiny CLI tool in Python (with Codex) that runs through the whole gamut of initial scoping before I implement the scraper itself.

The way it works is that it runs an escalating series of checks. For example, it starts with a basic request, then tries TLS impersonation, then checks whether any Cloudflare or DataDome cookies are set, just to gauge how challenging a website will be to scrape.
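The pattern, in sketch form, looks something like this. This is not scraperecon's actual code, just an illustration of the escalating checks, using requests and curl_cffi:

```python
# Illustration of the escalating-checks idea; not scraperecon's real code.
import requests
from curl_cffi import requests as cf_requests  # pip install curl_cffi

ANTIBOT_COOKIES = {"__cf_bm", "cf_clearance", "datadome"}  # Cloudflare / DataDome

def scope(url: str) -> None:
    # Level 1: a plain request with the default Python TLS fingerprint.
    plain = requests.get(url, timeout=10)
    print("plain request:", plain.status_code)

    # Level 2: retry with a browser TLS fingerprint.
    spoofed = cf_requests.get(url, impersonate="chrome", timeout=10)
    print("tls impersonation:", spoofed.status_code)

    # Level 3: look for known anti-bot cookies on the plain response.
    hits = set(plain.cookies.keys()) & ANTIBOT_COOKIES
    print("anti-bot cookies:", hits or "none")

scope("https://example.com")
```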

Give it a shot if you want to scope things out before you actually build your scrapers!

https://github.com/dakheera47/scraperecon

https://pypi.org/project/scraperecon/

u/DaKheera47 — 11 days ago