I am often surprised by the reticence of 'hackers' to integrate AI tooling and to interact with AI-based systems. There is an adherence to the old ways, the traditions, the 'this is how we've always done it' mindset around AI that is unusual, and not what I would expect from a culture and community usually dedicated to uprooting the status quo and working at the intersection of cutting-edge technology.
Where once I would have to trawl Stack Overflow to understand some arcane API syntax, scrolling through posts where others had stumbled in the same way, I can now just ask ChatGPT. A concrete example of this is working with BigQuery, the Google Cloud data warehousing system my startup uses. I am familiar with SQL, with cloud bindings, and with GCP, but the exact syntax and process of interfacing with BigQuery has a few quirks and a few missing functions. Rather than poring over the documentation and reading the few and far between forum threads to debug some strange error, Cursor's interface to o3-mini can usually tell me immediately where I have gone wrong and how to fix it. It solves my problem directly, but it has a second-order effect too: I don't really care about the BigQuery API syntax - it isn't interesting to me. I'm interested in interfacing with my data warehouse, not in learning the implementation details - AI can take that entire mental load off my plate.
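To make concrete the kind of boilerplate I mean, here is a minimal sketch of running a parameterised query with the official Python client. The project, dataset, and column names are invented for illustration; the calls themselves (Client, QueryJobConfig, ScalarQueryParameter) are the standard google-cloud-bigquery API, and the parameter-binding syntax is exactly the sort of quirk I would once have dug through forums to get right.

```python
# Sketch only: table and column names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # picks up default GCP credentials

query = """
    SELECT event_name, COUNT(*) AS n
    FROM `my_project.my_dataset.events`
    WHERE event_date >= @start_date
    GROUP BY event_name
"""

# BigQuery wants named parameters declared with an explicit type.
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

rows = client.query(query, job_config=job_config).result()
for row in rows:
    print(row.event_name, row.n)
```

None of this is hard, but none of it is interesting either - which is precisely why I am happy to let the model remember it for me.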
I also read a lot of forum posts suggesting that AI isn't useful for obscure programming languages or for extremely complex tasks and processes. That is probably true, but I wonder what these people's jobs are that their workflows never involve interacting with legacy code or strapping together frontends and prototypes - the boilerplate tech that is AI's bread and butter. Do they already outsource all of that work to junior (cheaper?) developers?
One of my favourite quotes on the utility of AI programming is 'It is possible to be precise at a higher level of abstraction as long as your prompts are consistent with a coherent model of the code.' Using natural language to interface with a computer is a new paradigm, and it is not yet clear what its limitations and best practices are. If you are uninterested in this and cling to the old ways, you will be left behind by those who adopt it.