Google on Tuesday announced support for third-party tools in Gemini Code Assist, its enterprise-focused AI code completion service. Code Assist launched in April as a rebrand of a similar service Google offered under its now-defunct Duet AI branding. Available through plug-ins for popular dev environments like VS Code and JetBrains, Code Assist is powered by Google's Gemini […]
They said that while AI could boost productivity, stakeholders should understand its limitations.
This article is part of "CXO AI Playbook" — straight talk from business leaders on how they're testing and using AI.
At a Business Insider roundtable in November, Neeraj Verma, the head of applied AI at Nice, argued that generative AI "makes a good developer better and a worse developer worse."
He added that some companies expect employees to be able to use AI to create a webpage or HTML file and simply copy and paste solutions into their code. "Right now," he said, "they're expecting that everybody's a developer."
During the virtual event, software developers from companies including Meta, Slack, Amazon, and Slalom discussed how AI influenced their roles and career paths.
They said that while AI could help with tasks like writing routine code and translating ideas between programming languages, foundational coding skills are necessary to use the AI tools effectively. Communicating these realities to nontech stakeholders is a primary challenge for many software developers.
Understanding limitations
Coding is just one part of a developer's job. As AI adoption surges, testing and quality assurance may become more important for verifying the accuracy of AI-generated work. The US Bureau of Labor Statistics projects that the number of software developers, quality-assurance analysts, and testers will grow by 17% in the next decade.
Expectations for productivity can overshadow concerns about AI ethics and security.
"Interacting with ChatGPT or Cloud AI is so easy and natural that it can be surprising how hard it is to control AI behavior," Igor Ostrovsky, a cofounder of Augment, said during the roundtable. "It is actually very difficult to, and there's a lot of risk in, trying to get AI to behave in a way that consistently gives you a delightful user experience that people expect."
For many developers, managing stakeholders' expectations — communicating the limits, risks, and overlooked aspects of the technology — is a challenging yet crucial part of the job.
Kesha Williams, the head of enterprise architecture and engineering at Slalom, said in the roundtable that one way to bridge this conversation with stakeholders is to outline specific use cases for AI. Focusing on the technology's applications could highlight potential pitfalls while keeping an eye on the big picture.
"Good developers understand how to write good code and how good code integrates into projects," Verma said. "ChatGPT is just another tool to help write some of the code that fits into the project."
Ostrovsky predicted that the ways employees engage with AI would change over the years. In the age of rapidly evolving technology, he said, developers will need to have a "desire to adapt and learn and have the ability to solve hard problems."
AWS said developers spend most of their time on non-coding tasks, impacting productivity.
It introduced Amazon Q Developer — an AI agent to aid developers — at the re:Invent keynote on Tuesday.
But junior engineers are concerned AI tools like Amazon Q could reduce coder demand.
Artificial intelligence could give coders more time to code. Programmers aren't sure whether that's a good thing.
In a post Tuesday, Amazon Web Services said developers report spending an average of "just one hour per day" on actual coding.
The rest is eaten up by "tedious, undifferentiated tasks," AWS said: learning codebases, drafting documents, testing, overseeing releases, fixing problems, and hunting down vulnerabilities. The company didn't say where it got the data.
AWS CEO Matt Garman spoke to the developers in the audience at the company's re:Invent keynote on Tuesday, introducing a tool he said would give them more time to focus on creativity. Amazon Q Developer is an AI agent that AWS is rolling out in two tiers with free and paid options.
The announcement is another indication that technology like AI could upend the way many coders do their jobs. Some have argued that AI will remove some of the tedium from tasks like creating documentation and generating basic code. That could be great for coders' productivity — and perhaps for their enjoyment of the jobs — yet it could also mean employers need fewer of them.
GitLab has reported that developers spend more than 75% of their time on tasks other than coding. Several veteran software engineers previously told BI that the time they spend coding is perhaps closer to half.
Software engineers on job forums like Blind are discussing how much they should rely on an AI assistant for their work. Some have asked for recommendations for the best agent, receiving replies that range from "your own brain" to genuine reviews. Others worry that AI has already become a crutch in their coding process.
AWS isn't the only tech giant offering AI to coders. Google CEO Sundar Pichai recently said that AI generates more than a quarter of the new code created at the search company. He said the technology was "boosting productivity and efficiency." Workers review the code that AI produces, Pichai said.
"This helps our engineers do more and move faster," he said. "I'm energized by our progress and the opportunities ahead, and we continue to be laser-focused on building great products."
The rise of AI could be worrisome for newbie programmers who need to develop their skills, according to Jesal Gadhia, head of engineering at Thoughtful AI, which creates AI tools for healthcare providers.
"Junior engineers," Gadhia previously told BI, "have a little bit of a target behind their back."
He said that when an AI tool touted as the "first AI software engineer" came out this year, he received texts from nervous friends.
"There was a lot of panic. I had a lot of friends of mine who messaged me and said, 'Hey, am I going to lose my job?'" Gadhia said.
Over the past two years, employees have expressed repeatedly that they are fed up with being asked to do too much.
Tough luck. The latest catchphrase to describe working less is "ghost engineer" — and it comes not from burnt-out employees but from a Stanford researcher whose team has developed an algorithm to help tech companies identify freeloading coders.
Stanford researcher and former Olympic-level weightlifter Yegor Denisov-Blanch ran the algorithm, which grades the quality and quantity of employees' code repositories on GitHub, on the work of more than 50,000 employees across hundreds of companies.
Roughly 9.5%, he found, "do virtually nothing."
Measuring output is difficult
Denisov-Blanch calls these workers "ghost engineers," defined as software engineers whose output is 10% or less of their median colleague's.
His research began as an attempt to find a better way to grade the performance of software engineers, he said in an interview with Business Insider.
"Software engineering is a black box," Denisov-Blanch said. "Nobody knows how to measure software engineers' performance. Existing measures are unreliable because they rate equal work differently."
"It's not fair when someone's doing a very complicated change that's only one line of code. And the person doing the very simple change that's 1,000 lines gets rewarded," he continued.
His algorithm attempts to resolve that tension, giving high ratings to engineers who write many lines of code only so long as that code is maintainable, solves complex problems, and is easy to implement.
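The cutoff Denisov-Blanch describes is simple enough to sketch in a few lines of Python. The scores below are hypothetical stand-ins for the algorithm's per-engineer ratings; only the 10%-of-median threshold comes from his definition.

```python
# Sketch of the "ghost engineer" cutoff: flag engineers whose
# productivity score is 10% of the median or less. The scores are
# made up; the threshold is the one quoted in the article.
from statistics import median

def ghost_share(scores):
    """Fraction of engineers at or below 10% of the median score."""
    cutoff = 0.10 * median(scores)
    return sum(s <= cutoff for s in scores) / len(scores)

sample = [100, 95, 80, 120, 60, 5, 2, 110, 90, 8]
print(f"{ghost_share(sample):.0%} of this sample do virtually nothing")
```

The real study grades quality as well as quantity, so this is the shape of the calculation rather than a faithful reproduction of it.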
Denisov-Blanch's research has not been peer-reviewed.
There are other caveats. Industry-wide, the 9.5% figure could be an overstatement because Denisov-Blanch's research team ran the algorithm only on companies that volunteered to participate in the study, introducing selection bias.
Conversely, while Denisov-Blanch's team didn't classify employees whose output is only 11% or 12% of the median engineer's as "ghost engineers," there's a strong argument that those employees aren't contributing much either, which could mean the 9.5% figure is an understatement.
Why does this matter?
It's insane that ~9.5% of software engineers do almost nothing while collecting paychecks.
This unfairly burdens teams, wastes company resources, blocks jobs for others, and limits humanity's progress.
Rooting out underperformers has lately become something of a mania among some in Silicon Valley.
In September, Y Combinator co-founder Paul Graham published an essay lauding a management style he called "founder mode," which he distinguished from the conventional wisdom of, in his words, "hire good people and give them room to do their jobs."
"In practice, judging from the report of founder after founder, what this often turns out to mean is: hire professional fakers and let them drive the company into the ground," Graham wrote.
Heading the charge has been Elon Musk, who has spoken proudly about firing 80% of Twitter's employees after buying the company in 2022. Twitter, now X, didn't appear to experience significant outages or service interruptions following the staff reduction.
"Were there many mistakes along the way? Of course. But all's well that ends well," he told CNN. "This is not a caring-uncaring situation. It's like, if the whole ship sinks, nobody's got a job."
More remote workers were superstar coders
Musk now aims to apply that same ruthless efficiency to the federal government. As co-chair of a new Department of Government Efficiency, he pledged in a Wall Street Journal op-ed to slash federal staffing, including by ending remote work to spur resignations.
"If federal employees don't want to show up, American taxpayers shouldn't pay them for the Covid-era privilege of staying home," Musk wrote.
Denisov-Blanch's research showed mixed results for remote work. On one hand, he found that the prevalence of "ghost engineers" among remote workers was more than double that among in-person workers.
But he also found that many more of the most effective engineers — employees whose performance was at least five times better than their median colleague's — were working remotely than were in-person.
Coding languages are a foundational element of any tech job, but not all are created equal.
Python and SQL are among the most popular languages; C++ and tools like Tableau are more specialized.
Business Insider spoke with recruiters and tech workers to identify the top eight languages to know.
Big Tech firms like Apple and Amazon have signaled a move away from the complicated coding language C++, but on Wall Street there's still a place for engineers who know it.
Apple created Swift to replace its use of C++, the company's primary coding language for its devices. Amazon recently awarded a Stony Brook University professor a research grant of roughly $100,000 to continue his work automating the conversion of existing C++ code to Rust, a language created in 2006. Even the White House has joined the conversation around C++, urging software developers in a February report to move away from the language over cybersecurity concerns.
But the financial space is still "one of the heavy users of C++ that's really doing cutting-edge stuff," one industry executive told Business Insider. High-frequency trading firms and exchanges rely on C++, a notoriously complicated language that can offer more control over the underlying hardware. It's also prevalent in the video game industry.
Citadel Securities, for one, recently hired a C++ expert from Microsoft to lead training initiatives on the language. Looking at current open technology positions, trading firms Virtu Financial and Hudson River Trading are among the firms also seeking out C++ experience.
It's also good to know in quantitative finance, one of the few bright spots in the current technology hiring slump, Matt Stabile, a tech recruiter who works with buy-side firms including Two Sigma and Susquehanna International Group, told BI.
In today's machine-to-machine world, having some experience with programming languages is a must. Coding languages, like Python and Java, are how humans communicate with computers, providing a set of instructions for a system to execute. As it turns out, not all programming languages are created equal, and some are more relevant to certain corners of Wall Street than others.
Business Insider spoke with recruiters, Wall Street tech execs, and industry insiders and analyzed job postings to learn about in-demand skill sets.
Here are the programming languages to know:
Python
Areas of interest: Applicable across finance firms, job titles, and levels
Firms using it: Banks, hedge funds, and investment firms
As the fundamental language for engineering work across Wall Street, Python has long been at the top of the skills list for buy- and sell-side firms alike. It has been a favorite at Capital One and Man Group.
From visualization to statistical analyses to modeling and machine-learning applications, Python has multiple use cases. It also lends itself to those who don't have deep coding backgrounds because it is flexible and applies to a wide range of users, Ori Ben-Akiva, director of portfolio management at Man Numeric, a quantitative-focused division of the publicly traded hedge fund Man Group, previously told BI.
When it comes to data science and machine learning roles, "Python is king of the road," said Stabile, who runs his own recruiting shop called Stabile Search.
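As a rough illustration of why Python lends itself to this kind of quick statistical work, here is a minimal, self-contained sketch using only the standard library; the return series is made up.

```python
# Toy example of the quick analysis Python is used for on a desk:
# summarize a (made-up) daily return series with the stdlib.
from statistics import mean, stdev

daily_returns = [0.012, -0.004, 0.007, 0.001, -0.009, 0.015]

avg = mean(daily_returns)   # average daily return
vol = stdev(daily_returns)  # sample standard deviation
ratio = avg / vol           # naive, unannualized risk-adjusted ratio

print(f"mean={avg:.4f} stdev={vol:.4f} ratio={ratio:.2f}")
```

In practice this is where libraries like pandas and NumPy take over, but the point stands: a usable analysis fits in a dozen readable lines.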
SQL
Areas of interest: Anyone who works with databases or data
Firms using it: Almost every financial firm
As data becomes more central to financial firms' strategies — from marketing to identifying new deal opportunities and analyzing risk — it's helpful to know SQL, which is one of the most common and basic ways to query or pull information from a database.
SQL is a relational database language, meaning it's designed to be able to tie different data tables together. For any tasks that have to do with analytics, you'll likely find SQL, Deepali Vyas, global head of fintech, payments, and crypto at Korn Ferry, told BI.
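A minimal illustration of that idea, using Python's built-in sqlite3 module with two made-up tables. The JOIN ties them together, which is the everyday analytics task Vyas describes:

```python
# Tie two relational tables together with a JOIN and aggregate.
# The tables and data are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE clients (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE trades (client_id INTEGER, symbol TEXT, qty INTEGER);
    INSERT INTO clients VALUES (1, 'Acme Fund'), (2, 'Beta Capital');
    INSERT INTO trades VALUES (1, 'AAPL', 100), (1, 'MSFT', 50), (2, 'GOOG', 25);
""")

rows = conn.execute("""
    SELECT c.name, SUM(t.qty) AS total_qty
    FROM trades t JOIN clients c ON c.id = t.client_id
    GROUP BY c.name ORDER BY total_qty DESC
""").fetchall()

for name, total in rows:
    print(name, total)  # e.g. Acme Fund 150
```

The same SELECT/JOIN/GROUP BY pattern applies whether the database is a laptop SQLite file or a firm-wide warehouse.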
C++
Areas of interest: Low-latency applications
Firms using it: High-frequency trading players and exchanges
For applications and systems where speed (or a fast response time) is the name of the game, experience as a C++ developer is going to come in handy. That's especially true at high-frequency trading firms and exchanges, where companies edge each other out by being microseconds faster than the competition.
The coding language has a reputation for being trickier to master than others, and its ability to interact more closely with technical hardware can lead to nasty coding bugs, but it also generally affords the user more control and speed.
Tableau and Power BI
Areas of interest: Data visualization, front-office analysts
Firms using it: Wealth managers, banks
When Wall Street tech execs talk about data, it's often broken up into organizing it and finding insights within it.
Korn Ferry's Vyas said the latter benefits from tools like Tableau and Power BI, which visualize and contextualize data. These types of graphics are especially useful if you work in wealth management or advisory, where dashboards and data tables are regularly used.
Java
Areas of interest: Big banks with more legacy technology
Firms using it: Banks and some buy-side firms
Like Python, Java is widely used on Wall Street. The coding language secured an early foothold in the world of banking because it was believed to have security features that restricted data access, while also offering portability, or the ability to be transferred between machines.
As a result, many big banks are tethered to Java, but other firms like Two Sigma have also relied on the coding language.
Rust and Go
Areas of interest: App development
Firms using it: Fintechs, banks
Technically, many of the coding languages on this list — like Python, for example — are open source, or available for developers to use without a proprietary license.
But several open-source languages have become more in-demand in recent years, including Go and Rust. When the banking fintech Stash built much of its core banking offering from the ground up in 2022, tech leaders at the company highlighted the use of Go — which they said was picked up quickly by engineers and cut the implementation time for "substantial" new pieces of code to roughly 3.5 days.
Fintechs aren't the only financial firms embracing Go and other open-source tools. At BlackRock, much of the firm's cloud work was built upon open-source software. Wells Fargo in recent years has embraced Rust and Go as languages the bank is becoming more comfortable with.
Editor's note: This article was originally published in 2022 and has been updated with new information.
Former BI reporter Carter Johnson also contributed to the previous reporting.
The future of software-development jobs is changing rapidly as more companies adopt AI tools that can accelerate the coding process and close experience gaps between junior- and senior-level developers.
Increased AI adoption could be part of the tech industry's "white-collar recession," which has seen slumps in hiring and recruitment over the past year. Yet integrating AI into workflows can offer developers the tools to focus on creative problem-solving and building new features.
On November 14, Business Insider convened a roundtable of software developers as part of our "CXO AI Playbook" series to learn how artificial intelligence was changing their jobs and careers. The conversation was moderated by Julia Hood and Jean Paik from BI's Special Projects team.
These developers discussed the shifts in their day-to-day tasks, which skills people would need to stay competitive in the industry, and how they navigate the expectations of stakeholders who want to stay on the cutting edge of this new technology.
Panelists said AI has boosted their productivity by helping them write and debug code, which has freed up their time for higher-order problems, such as designing software and devising integration strategies.
However, they emphasized that some of the basics of software engineering — learning programming languages, scaling models, and handling large-scale data — would remain important.
The roundtable participants also said developers could provide critical insight into challenges around AI ethics and governance.
The roundtable participants were:
Pooya Amini, software engineer, Meta.
Greg Jennings, head of engineering for AI, Anaconda.
Shruti Kapoor, lead member of technical staff, Slack.
Aditi Mithal, software development engineer, Amazon.
Igor Ostrovsky, cofounder, Augment.
Neeraj Verma, head of applied AI, Nice.
Kesha Williams, head of enterprise architecture and engineering, Slalom.
The following discussion was edited for length and clarity.
Julia Hood: What has changed in your role since the popularization of gen AI?
Neeraj Verma: I think the expectations that are out there in the market for developers on the use of AI are actually almost a bigger impact than the AI itself. You hear about how generative AI is sort of solving this blank-paper syndrome. Humans have this concept that if you give them a blank paper and tell them to go write something, they'll be confused forever. And generative AI is helping overcome that.
The expectation from executives now is that developers are going to be significantly faster but that some of the creative work the developers are doing is going to be taken away — which we're not necessarily seeing. We're seeing it as more of a boilerplate creation mechanism for efficiency gains.
Aditi Mithal: I joined Amazon two years ago, and I've seen how my productivity has changed. I don't have to focus on doing repetitive tasks. I can just ask Amazon Q chat to do that for me, and I can focus on more-complex problems that can actually impact our stakeholders and our clients. I can focus on higher-order problems instead of more-repetitive tasks for which the code is already out there internally.
Shruti Kapoor: One of the big things I've noticed with writing code is how open companies have become to AI tools like Cursor and Copilot and how integrated they've become into the software-development cycle. It's no longer considered a no-no to use AI tools like ChatGPT. I think two years ago when ChatGPT came out, it was a big concern that you should not be putting your code out there. But now companies have kind of embraced that within the software-development cycle.
Pooya Amini: Looking back at smartphones and Google Maps, it's hard to remember what the world looked like before these technologies. It's a similar situation with gen AI — I can't remember how I was solving the problem without it. I can focus more on actual work.
Now I use AI as a kind of assisted tool. My main focus at work is on requirement gathering, like software design. When it comes to the coding, it's going to be very quick. Previously, it could take weeks. Now it's a matter of maybe one or two days, so then I can actually focus on other stuff as AI is solving the rest for me.
Kesha Williams: In my role, it's been about helping my team rethink their roles and see AI not as a threat but as a partner that can boost productivity, and encouraging them to make use of some of the new embedded AI and gen-AI tools. It's really about helping my team upskill and putting learning paths in place so that people can embrace AI and not be afraid of it. Many of the junior-level developers are really afraid of AI replacing them.
Hood: Are there new career tracks opening up now that weren't here before?
Verma: At Nice, we have something like 3,000 developers, and over the last, I think, 24 months, 650 of them have shifted into AI-specific roles, which was sort of unheard of before. Even out of those 650, we've got about a hundred who are experts at things like prompt engineering. Over 20% of our developers are not just developers being supported by AI but developers using AI to write features.
Kapoor: I think one of the biggest things I've noticed in the last two to three years is the rise of a job title called "AI engineer," which did not exist before, and it's kind of in between an ML engineer and a traditional software engineer. I'm starting to see more and more companies where AI engineer is one of the top-paying jobs available for software engineers. One of the cool things about this job is that you don't need an ML-engineering background, which means it's accessible to a lot more people.
Greg Jennings: For developers who are relatively new or code-literate knowledge workers, I think they can now use code to solve problems where previously they might not have. We have designers internally that are now creating full-blown interactive UIs using AI to describe what they want and then providing that to engineers. They've never been able to do that before, and it greatly accelerates the cycle.
For more-experienced developers, I think there are a huge number of things that we still have to sort out: the architectures of these solutions, how we're actually going to implement them in practice. The nature of testing is going to have to change a lot as we start to include these applications in places where they're more mission-critical.
Amini: On the other side, looking at threats that can come out of AI, new technologies and new positions can emerge as well. We don't currently have clear regulations in terms of ownership or the issues related to gen AI, so I imagine there will be more positions in terms of ethics.
Mithal: I feel like a Ph.D. is not a requirement anymore to be a software developer. If you have some foundational ML and NLP knowledge, you can target some of these ML-engineer or AI-engineer roles, which gives you a great opportunity to be in the market.
Williams: I'm seeing new career paths in specialized fields around ML and LLM operations. My developers are able to focus more on strategy, system design, and creative problem-solving, and it seems to help them move faster into architecture. System design, system architecture, and integration strategies — they have more time to do that because of AI.
Jean Paik: What skills will developers need to stay competitive?
Verma: I think a developer operating an AI system requires product-level understanding of what you're trying to build at a high level. And I think a lot of developers struggle with prompt engineering from that perspective. Having the skills to clearly articulate what you want to an LLM is a very important skill.
Williams: Developers need to understand machine-learning concepts and how AI models work, not necessarily how to build and train these models from scratch but how to use them effectively. As we're starting to use Amazon Q, I've realized that our developers are now becoming prompt engineers because you have to get that prompt right in order to get the best results from your gen-AI system.
Jennings: Understanding how to communicate with these models is very different. I almost think that it imparts a need for engineers to have a little bit more of a product lens, where a deeper understanding of the actual business problem they're trying to solve is necessary to get the most out of it. Developing evaluations that you can use to optimize those prompts, so going from prompt engineering to actually tuning the prompts in a more-automated way, is going to emerge as a more common approach.
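The evaluation loop Jennings sketches can be illustrated in miniature. Everything below is a toy: `fake_model` stands in for a real LLM call, and the prompts and test cases are invented.

```python
# Toy prompt-evaluation loop: score candidate prompts against a
# small test set and keep the best performer. A real setup would
# call an actual model and use many more cases.

def fake_model(prompt, text):
    # Stand-in "model": returns the last word only when asked for it.
    return text.split()[-1] if "last word" in prompt else text.split()[0]

eval_cases = [("the quick brown fox", "fox"), ("hello world", "world")]
candidates = ["Return the first word.", "Return the last word of the text."]

def score(prompt):
    # Number of eval cases the prompt gets right.
    return sum(fake_model(prompt, text) == expected
               for text, expected in eval_cases)

best = max(candidates, key=score)
print("best prompt:", best)
```

The shift from hand-tweaking prompts to ranking them against fixed evaluations is exactly the "more-automated" tuning described above.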
Igor Ostrovsky: Prompt engineering is really important. That's how you interact with AI systems, but this is something that's evolving very quickly. Software development will change in five years much more rapidly than anything we've seen before. How you architect, develop, test, and maintain software — that will all change, and how exactly you interact with AI will also evolve.
I think prompt engineering is more of a sign that some developers have the desire to learn and are eager to figure out how to interact with artificial intelligence, but it won't necessarily be how you interact with AI in three years or five years. Software developers will need this desire to adapt and learn and have the ability to solve hard problems.
Mithal: As a software developer, some of the basics won't change. You need to understand how to scale models, build scalable solutions, and handle large-scale data. When you're training an AI model, you need data to support it.
Kapoor: Knowledge of a programming language would be helpful, specifically Python or even JavaScript. Knowledge of ML or some familiarity with ML will be really helpful. Another thing is that we need to make sure our applications are a lot more fault-tolerant. That is also a skill that front-end or back-end engineers who want to transition to an AI-engineering role need to be aware of.
One of the biggest problems with prompts is that the answers can be very unpredictable and can lead to a lot of different outputs, even for the same prompt. So being able to make your application fault-tolerant is one of the biggest skills we need to apply in AI engineering.
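Kapoor's point can be sketched as a validate-and-retry wrapper. `flaky_llm` below is a simulated stand-in for a real model call, not an actual API; the pattern is what matters: reject malformed output, accept only known labels, and fall back safely.

```python
# Fault-tolerant handling of unpredictable model output: validate
# the response, retry on failure, and return a safe fallback.
import json

def flaky_llm(prompt, attempt):
    # Simulated model: malformed chatter first, valid JSON on retry.
    return "Sure! Here you go:" if attempt == 0 else '{"sentiment": "positive"}'

def classify(prompt, retries=3):
    for attempt in range(retries):
        raw = flaky_llm(prompt, attempt)
        try:
            data = json.loads(raw)            # reject non-JSON replies
        except json.JSONDecodeError:
            continue                          # malformed, so retry
        if data.get("sentiment") in {"positive", "negative", "neutral"}:
            return data["sentiment"]          # accept only known labels
    return "unknown"                          # safe fallback after retries

print(classify("Classify the sentiment of: 'great quarter'"))
```

The application never trusts the raw string; every code path ends in either a validated label or an explicit "unknown."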
Hood: What are the concerns and obstacles you have as AI gains momentum? How do you manage the expectations of nontech stakeholders in the organization who want to stay on the leading edge?
Ostrovsky: Part of the issue is that interacting with ChatGPT or cloud AI is so easy and natural that it can be surprising how hard it is actually to control AI behavior, where you need AI to understand constraints, have access to the right information at the right time, and understand the task.
When setting expectations with stakeholders, it is important that they understand we're working with very advanced technology and that they are realistic about the risk profile of the project.
Mithal: One is helping them understand the trade-offs. It could be security versus innovation or speed versus accuracy. The second is metrics. Is it actually improving the efficiency? How much is the acceptance rate for our given product? Communicating all those to the stakeholders gives them an idea of whether the product they're using is making an impact or if it's actually helping the team become more productive.
Williams: Some of the challenges I'm seeing are mainly around ethical AI concerns, data privacy, and costly and resource-intensive models that go against budget and infrastructure constraints. On the vendor or stakeholder side, it's really more about educating our nontechnical stakeholders about the capabilities of AI and the limitations and trying to set realistic expectations.
We try to help our teams understand how AI can be applied in their specific business areas — how we can use AI in marketing or HR or legal — and give them real-world use cases.
Verma: Gen AI is really important, and it's so easy to use ChatGPT, but what we find is that gen AI makes a good developer better and a worse developer worse. Good developers understand how to write good code and how good code integrates into projects. ChatGPT is just another tool to help write some of the code that fits into the project. That's the big challenge that we try to make sure our executives understand, that not everybody can use this in the most effective manner.
Jennings: There are some practical governance concerns that have emerged. One is understanding the tolerance for bad responses in certain contexts. For some problems, you may be more willing to accept a bad response because you structure the interface in such a way that there's a human in the loop. If you're attempting to not have a human in the loop, that could be problematic depending on what you want the model to do. Organizations also need to build better muscle for developing good intuition about where these models can potentially fail, and in what ways.
In addition to that, understanding what training data went into that model, especially as models are used more as agents and have privileged access to different applications and data sources that might be pretty sensitive.
Kapoor: I think one of the biggest challenges that can happen is how companies use the data that comes back from LLM models and how they're going to use it within the application. Removing the human component scares me a lot.
Verma: It's automation versus augmentation. There are a lot of cases where augmentation is the big gain. I think automation is a very small, closed case — there are very few things in the world right now that I think LLMs are ready to automate.