Posted in 2021

Analyzing LinkedIn's data export: what happened in 2021?

I've been using LinkedIn basically since I started working as an intern back in 2012. My usage is mostly limited to posting my blog posts, except for the couple of times I used the platform to search for a new job. So most of the time, LinkedIn has been pretty slow-paced, with maybe half a dozen random recruiters reaching out per year.

However, since the Covid-19 pandemic started, and particularly in 2021, things seem to have gone a little crazy, with a lot more recruiter activity. I was curious to see just how much things had changed, so I looked at LinkedIn's data export.

Botched interviews

Here's something I've been wanting to write for a while: all the times (the ones I can remember, anyway) I bombed a software engineer job interview. There are so many "how I aced interviewing at X"/"how to pass X interview" posts floating around that I thought the opposite story would make for an amusing read.

My first developer job was as an intern at a big tech company in 2012. That was one of the worst interviews I've had, by the way: I could barely understand the interviewer over the cellphone, and those were the days of "how many piano players are there in New York"-type questions. I thought it went terribly, but I got the job somehow. On the other hand, I've had many interviews I thought went great but that I bombed anyway.

Efficient resource distribution

TL;DR: A simple metrics-based ranking system is good enough to decide who gets how many resources.

Computational resources (CPU time, memory, network traffic, etc.) are limited. How much of a problem this is depends on project and company size; if you're working on a smaller product with limited traffic, it might not matter at all.

Once past a certain threshold, though, spending on these resources becomes non-trivial, and it begins to make sense to spend some time thinking about how to distribute them as efficiently as possible.

Here's the problem that got me thinking about this: at work, we had a computational resource that needed to be consumed by a large fleet of workers (think several thousand concurrent workers), but each type of worker had different productivity, and that productivity changed over time. How can we decide who gets what?
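A minimal sketch of the metrics-based idea, assuming the simplest possible ranking: measure each worker type's recent productivity and split the budget proportionally, re-running the split whenever the measurements change. The function name and the productivity numbers here are illustrative assumptions, not the actual system from the post.

```python
def allocate(budget, productivity):
    """Split `budget` units of a resource across worker types in
    proportion to each type's measured productivity (e.g. items
    processed per CPU-second over a recent window)."""
    total = sum(productivity.values())
    if total == 0:
        # No productivity signal yet: fall back to an even split.
        share = budget / len(productivity)
        return {name: share for name in productivity}
    return {name: budget * p / total for name, p in productivity.items()}

# Worker type "b" is measured as twice as productive as "a",
# so it receives twice the budget.
quotas = allocate(900, {"a": 1.0, "b": 2.0})
# quotas == {"a": 300.0, "b": 600.0}
```

Since productivity drifts over time, the split would be recomputed periodically from fresh metrics rather than fixed once.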

Onsites considered harmful

A couple of years ago I interviewed at one of the largest Ruby shops out there. Screening went well, and a few days later I was invited for an onsite.

These were the good old pre-Covid days, so an onsite really meant onsite: you had to travel to the office, wherever that was.

The thing is, an onsite is actually radically different depending on where you live. It follows that onsites introduce further bias into our industry's already problematic hiring process. I'd like to argue that although onsites have some advantages, they're mostly a waste of time (and money).