I have always loved technology, for as long as I can remember. I was drawn to computers before I even started primary school. That and video games have always been a driving force behind the things I do. Growing up, I was always super keen on everything new that came out, and eventually, I decided to study software development.

And I loved it, I loved it oh so much. Learning about what made computers tick was such an eye-opening and fascinating period of my life. Now, mind you, what I studied wasn't big boy university computer science, but a "short" two-year degree that teaches you kinda the bare minimum of software development so you can get thrown into an underpaid job. But in the latter half of those two beautiful years, generative AI was starting to become a bit of a thing, though it was nothing but a toy that generated dungeon crawler ramblings and images that kinda resembled what you typed in. But boy, was I clueless about what it would become.

I ended up landing a job at the beginning of 2020. And, leaving aside some virus some might have heard about, I was really happy. Truth be told, the job was a few shoe sizes bigger than what I was wearing at the time, and I had to put in a lot of extra learning effort outside of work hours to try to catch up and get better. Over time, I think I proved I could be a competent software developer, and I was feeling quite happy with what I was achieving in my job, and felt satisfied writing code. But lurking in the shadows was something I wasn't expecting.

While I did toy around with the generative AI tools that were out back in the late 2010s, when things like ChatGPT were released to the public and I saw the marketing around them, I couldn't help but think back to those little toys I'd used before and assume this would go nowhere. Why would I want to use a tool that's unreliable? "Who would want to use a calculator that can't do addition accurately?", I told myself. But then I turned out to be wrong. Not about its unreliability, mind you, but about it going nowhere. Suddenly I saw a lot of people starting to use it as a replacement for search engines, as if what it says is the truth because "it's connected to the internet". Have we suddenly forgotten that what we read on the internet sometimes just isn't true? And now people are choosing to depend on something that can decide to hallucinate facts at any point, while sounding 100% certain as it does so? I'm not saying it's always wrong, but I would rather spend a little extra time doing some research than ask a tool that can potentially be wrong, and then do the research anyway in case it is. Why add the extra step?

But I'm getting sidetracked. Sure, this tool was gaining popularity, but surely I could just ignore it and keep going with my life, right?

Right?

Oh, I wish.

As always with technology, things don't stop advancing just because you stop looking at them. So, eventually, the technology started getting better at writing software (allegedly), and it started rearing its ugly head at my job. Coworkers started mentioning it, higher-ups started seeing the potential, and Copilot licenses were handed out to "try". And me, being aware of its cons, I didn't even think it was worth trying, and moved on with my tasks. But I kept seeing teammates and other coworkers using it and singing its praises, and I couldn't really understand why. Maybe I just have a different mindset, but I couldn't believe what I was hearing. Our jobs don't only involve writing software, but other things that are, in my opinion, much more boring: attending meetings, writing documentation, among other things. Yet people here were actively choosing to leave the coding to a machine that could easily introduce bugs and vulnerabilities with you being none the wiser? And sure, you're meant to review the code it spits out, but that brings me back to the same question: why would I use a tool that does a task for me, then check that it did it correctly, when I can just do the task myself?

It didn't, and, being honest, it doesn't make sense to me to this day. But I'm starting to realize why, and the reason is quite simple: I like to write code. Shocking, I know, but I like the process of fighting with a programming language and coming up with a solution that fits the problem I'm trying to tackle. It's a puzzle. I love puzzle games, and this is kind of like that. It's what gives me joy in this profession. To quote James Baxter's comment on an interpolation video1:

Replace stuff that people don't want to do, or can't do. But people get joy (and their livelihood) from creating art, and they want to do it.

I know it might be a bit stupid to compare what I do with art, but it really echoes the feeling I get from writing software. It's my own way of solving a problem. The code I write is a part of me. We all have our own style, our own ways of doing things, our own ways of approaching the puzzle. And now my superiors are pushing me to generate my code and give AI tools a shot. And I personally don't want to2, because it would erase the only thing that keeps me going in this job. But, sadly for me, that is not an excuse that will fly. So I find myself in a position that is both concerning and saddening.

Eventually, using AI will stop being an option and become a requirement, and if I want to keep my job I'll have to "give in". But I don't think I can just abandon my beliefs and give up the only thing that really keeps me working this job. And honestly, I'm not very sure where that leaves me.