
First AI came for the grunts….

As time goes by, I've gotten less and less interested in studies showing how good modern AI is. Many of them are fascinating, but I want to see real-world examples. Today, the New York Times says Wall Street is about to start replacing junior financial analysts in enormous numbers:

The jobs most immediately at risk are those performed by analysts at the bottom rung of the investment banking business, who put in endless hours to learn the building blocks of corporate finance, including the intricacies of mergers, public offerings and bond deals. Now, A.I. can do much of that work speedily and with considerably less whining.

....The software, being deployed inside banks under code names such as “Socrates,” is likely not only to change the arc of a Wall Street career, but also to essentially nullify the need to hire thousands of new college graduates.

Fine. It was bound to happen. But if Wall Street investment banks stop hiring entry-level analysts, where will they get their associates and directors and managing partners? There will be no one to move up the ranks.

This is maybe not the biggest issue with AI, but it's a very general one since it's the grunt jobs AI will take over first. Where will all the senior reporters and senior bookkeepers and senior coders come from if AI fills all the junior ranks?

31 thoughts on “First AI came for the grunts….”

  1. antiscience

    I read your post with amusement. It's widely known that most of these "analysts" are just regurgitating pap, and a lot of what they produce is determined by who signs their paycheck. It's not for nothing that almost all 'sell-side' analysts are as useless as tits on a boar. And even those analysts who style themselves as "neutral" are in fact heavily biased towards the sell-side: it's very rare to find an analyst whose *recommendation errors* aren't heavily biased towards their "buy" recommendations. That is to say, when analysts make errors, it's somehow and inexplicably (haha, it's very explicable) towards making too many "buy" recommendations.

    This culling of low-level analysts in the financial biz couldn't happen to nicer folks, and I look forward to their redeployment in retail services roles. As in: "Espresso for Kevin! Espresso for Kevin!"

    1. MF

      Actually, Wall Street competes with Silicon Valley for the best and brightest university and business school graduates. They will not be doing "Espresso for Kevin" even if the entry level Wall Street jobs vanish.

      They will head off to law school or consulting or other fields almost as lucrative as Wall Street. Those who would have had those opportunities will also move one rung down the ladder. The people who end up hawking espressos will be the people from third-tier schools with liberal arts degrees who might otherwise have gotten a passable job, gone into teaching, etc.

  2. rick_jones

    The AI won’t be replacing 100% of them. Unless the span of control is very narrow, there will still be fodder for the higher levels. At least until those too are subsumed. NYC, Chicago, and SF should be just as concerned as young b-school students…

  3. jonziegler

    Hey Kevin,

    This general issue was the basis of a book, Mind Over Machine (1986), by Hubert and Stuart Dreyfus.

    https://dl.acm.org/doi/book/10.5555/7916

    Gerard Salton’s review covers some of their arguments.

    I think the simple answer today is that the takeover of decision-making at all levels will happen so fast (within a few years) that the issue is moot. We will have no human experts.

    This is pretty much what has already happened in the many arts that are done by machine nowadays: there are hobbyists, but no actual experts left.

    And yes, this is scary.

    Regards,
    Jon

    1. roboto

      Not nearly as scary as the "experts" who came up with insane policies during Covid: lockdowns, schools closing for a year or a year and a half, mask mandates and mRNA shot mandates.

      1. Crissa

        Ahh, yes, the 'insane' experts who...

        *checks notes*

        Made recommendations that, if followed, reduced the death rate.

        Businesses that masked had fewer excess sick days; populations that vaccinated had fewer deaths; shutting down certain activities led to fewer super-spreader events. And school kids didn't do magically better when their schools ignored the parents.

  4. James B. Shearer

    "... There will be no one to move up the ranks."

    For one thing, the AI systems will get more capable over time and move into higher-level positions. For another, in many cases the low-level experience isn't that useful and people will start higher up.

    1. iamr4man

      I think that’s what Kevin was getting at when he “asked” that question. Note that the answer is implied by the headline.

  5. illilillili

    > Where will all the senior reporters and senior bookkeepers and senior coders come from

    Nepotism will still be alive and well, only more so.

    1. different_name

      Yeah, this. "AI" is going to be a massive moat, enabling a new infestation of dynastic mediocrities controlling things.

      Just wait until drones turn cops into video game operator/clerks (you'll need a human in the loop until criminal procedure allows the robots to testify against humans).

  6. megarajusticemachine

    Very nice (and revealing) of them to accuse displaced workers of being "whiners" after saying they had put in "endless hours".

  7. Murc

    I guess my question is "who is assuming the liability here?"

    Financial institutions are not the legally immune leviathans we imagine them to be, especially not when sued by other financial institutions or equally large and powerful entities. They have to warrant their work and their analysis in certain ways; sign off on it in ways that say "we say this work is accurate and are willing to put our asses on the line, legally speaking, to stand by that statement."

    That means that either a human being somewhere or the institution as a whole is liable if and when someone hits a button to start these tools working and it turns out the tool wasn't fit for purpose, like trying to drive nails with a screwdriver. You can fire a junior analyst who fucks up; in some cases, if the fuckup is bad enough, that guy can go to prison. You can't fire a machine; you can only stop using it. Oh wait, you don't have an easy replacement, do you?

    The courts have so far been taking a dim view of "it's not our fault, it's the machine, we shouldn't be held responsible."

    1. golack

      There can be investor revolts and even lawsuits if they don't employ the latest tools to make the most money.

      Curious to see if the number of lawsuits spikes with AI-written torts, etc., making it easier (and cheaper?) to file over everything...

  8. D_Ohrk_E1

    "where will they get their associates and directors and managing partners?"

    The winnowing is vertical. Fewer analysts lead to fewer directors and more money for the managing partners.

  9. jdubs

    Well, this is certainly the first time ever that leadership has sold slashing low-level jobs because a new tech fad will change work and the world forever.

    It's all different this time, I'm sure...just like it was last time and the time before that. There is a lot of 'AI' to sell and a lot of corporate restructuring to sell....selling this same story is always the first key piece of groundwork to get everyone into a fever pitch and ready to buy the sales pitches to come. It's going to be relentless (AI scissors, AI hamburgers, AI pillows, AI tulips), so you have to make sure the cheerleaders are in place first. The sales story doesn't change, but the targets do.

  10. Vog46

    I am old, so a lot of this stuff seems foreign to me.
    But it also seems like AI will replace a lot of people who make their living in advisory roles. Stock analyst is just one of many roles I could think of. The question becomes: how will they deploy AI in this business? Will it be a case of the first company to deploy AI wins? And how they deploy it will also be important.

    Will it be available ONLY to those that can afford it? Or, will the common person be able to take advantage of AI's computing power?

  11. Jasper_in_Boston

    "Today, the New York Times says Wall Street is about to start replacing junior financial analysts in enormous numbers..."

    What the NY Times reports Wall Street is going to do and what Wall Street actually does aren't necessarily the same thing. Also, if it's anything like past waves of financial services automation, we may see that particular job category decline, while recruitment and salary dollars are transferred to other areas that produce greater profits (in other words, no net decline in Wall Street jobs).

  12. tango

    I know that the market for finance majors is not collapsing all at once. But as the father of a business major (a Junior), I am very glad that my son did not choose to concentrate in Finance.

    That raises the question, though: what IS a safe major for a college student hoping to start a career in the face of AI?

      1. tango

        Hah, I was talking about this subject with my son just the other day and that was exactly what we came up with! Of course, most folks want to be white collar these days...

  13. Chip Daniels

    Time for me to bang the drum again on how AI is going to have its biggest impact on the white-collar professions, because most of what they do is very amenable to algorithms.
    Diagnosing an illness and prescribing a course of treatment; analyzing a tort, researching case law, and suggesting a legal theory; taking inputs and crafting an engineering design: these are the core work of the big three white-collar professions, and all are very much what an algorithm can do faster and better.

    ETA: Here is a website that designs a simple beam from user inputs alone. When I started in architecture in 1981, this was done with a slide rule and pencil by a trained engineer.
    https://clearcalcs.com/freetools/beam-analysis/us
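
    As a rough illustration of the kind of hand calculation a tool like that automates (this is standard beam theory, not anything specific to that site): for a simply supported beam of span L carrying a uniform load w, the maximum bending moment is M = w*L^2/8 and the maximum midspan deflection is 5*w*L^4/(384*E*I), where E is the material's elastic modulus and I is the section's moment of inertia. A calculator like that presumably just takes w, L, E, and I as inputs, evaluates those formulas, and checks the results against allowable stress and deflection limits.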

  14. KJK

    Around Thanksgiving, I spoke to a NYC lawyer who told me that AI is reducing the number of first-year associates his firm will be hiring, so it comes as no surprise that AI would materially impact banking and investment banking associates in a similar manner.

    How far up the chain this will occur depends on how smart AI gets, or when it gets smart enough to provide useful work beyond gathering up information, writing the first draft of company backgrounds, and spreading numbers for credit and investment analysis. Sound judgment and interpersonal skills are what is needed in more senior positions, since almost all business decisions occur within an environment of incomplete information and uncertainty.

  15. bethby30

    I literally just listened to a recent Science Friday podcast about AI being better at predicting cardiovascular disease from chest X-rays than the best assessment tool now in use. If I understood correctly, they aren't sure what was being detected in the chest X-rays that was key to the prediction. AI was also apparently able to detect diabetes from chest X-rays by picking up fat deposits.

    https://www.sciencefriday.com/segments/ai-heart-disease-chest-x-rays/

    1. Doctor Jay

      I listened to this, and it's interesting. Here are some takeaways and interesting points, as far as my own impression of it goes:

      * They don't say how much better the prediction is, or how good the current predictions are. That would be valuable context.
      * The probable channels for the prediction are calcium score and ejection fraction; there is a possibility that fatty deposits might be picked up as well.
      * The answer to "how did the AI manage to figure this out?" is that it used math and statistics on a very large dataset and just kept trying to find patterns. This is what humans who do this do, too. Computers are faster and they don't have to sleep or eat.

  16. Doctor Jay

    Everything I've seen about LLMs suggests that they simply have no function that asks "Is this correct?" They will cite dates from the wrong year, because there's nothing in the training that dings that.

    Now, it seems to me that maybe the kinds of neural nets that the financial industry wants to use are quite different from LLMs. I dunno, maybe they can make this work, but maybe it will be a problem.

    Humans are messy and unpredictable, which is why you want other humans interacting with them, not machines. The machines can do a lot of the grunt work, and if you have a lot of repetitive grunt work, even if it's filling out spreadsheets every day with complex and valuable financial information, then you might do well to get a computer to manage it.

  17. jeffreycmcmahon

    Hollowing out the finance sector and depleting it of real-world knowledge and experience, eventually and inevitably crippling it, is actually the best possible use of this technology.
