- September 3, 2025
Financial institutions around the world are planning to use artificial intelligence (AI) and automation to replace swathes of their workers, with as many as 200,000 jobs predicted to be lost across the industry in the coming years. Australia’s biggest bank, Commonwealth Bank of Australia (CBA), has been embroiled in a public relations disaster of its own making after it was forced to reinstate 45 workers over the summer following a complaint from their union. The row began when the bank fired the employees for reasons it later disavowed, leading to a retraction of their sacking.
The CBA workers were informed that their jobs were being made redundant by the bank’s new AI “voice bot” system, only for the bank to subsequently concede it had no proof the workloads had declined as a result of automation. Some of the employees had been with the bank for up to 25 years, and workloads appeared to have actually increased over the period in which their jobs were made redundant.
The case arose when the bank announced to a group of staff that their jobs were no longer necessary because of the introduction of the “voice bot” technology. Staff had previously been informed of an increase in their workloads, leading to managers being temporarily re-assigned to cover calls and existing workers being asked to work overtime. The bank argued that its technology had cut customer call numbers by 2,000 per week, meaning there was no longer a need for as many employees. However, after the initial announcement, employees responded angrily, with many pointing out that call numbers had been on the rise, not in decline.
In what it described as a “landmark” case, the Finance Sector Union (FSU) then took CBA to the tribunal, arguing that the bank’s redundancies had not been properly justified. In particular, the union claimed the decision was presented as one driven by the chatbot, when in fact workloads had been rising in the months preceding the announcement, making it unclear how the bank had concluded the roles could be spared. Union lawyers also argued CBA had failed to properly consult staff about the change before it was announced.
By the time of a tribunal hearing in late July, CBA had made a dramatic admission. In a document read at the hearing, it conceded that “at the time of making the selection for redundancy, the roles were not redundant.” This was due to a rise in call volumes over the preceding months, a spike that continued after the redundancies were made and directly contradicted the original case for layoffs. This sudden increase in workload, which the bank subsequently recognized, went against its claim that the chatbot had “automated the handling of common questions, reducing incoming calls by approximately 2,000 per week.”
The dispute has now been resolved after CBA issued an apology and reinstated its 45 workers, giving them the option of returning to their old roles, finding other roles in the company, or accepting a redundancy package. CBA has confirmed that it made a mistake, with a spokesperson telling Bloomberg that the affected staff had been contacted and offered reinstatement and that the bank had learned lessons from the controversy. “We have apologized to the employees concerned and acknowledge we should have been more thorough in our assessment of the roles required,” the spokesperson added.
The FSU described the reversal of the layoffs as a “massive win” for its members, but said the action had still caused real harm, with workers left in limbo for weeks, uncertain how they would feed their families and pay their bills. It also suggested the episode was a wider warning to banks around the world not to turn to automation without assessing the full implications for affected staff.
In its statement, the union said the company had offered no evidence that staff had been correctly informed, and that the chatbot announcements had been made under false pretenses. CBA has still not commented on this specific allegation but noted that it “acted in a manner that was fair and consistent with our obligations.”
The CBA case is a blow to the bank’s ambitions in artificial intelligence (AI), but not an end to its deployment of the technology. The bank’s new partnership with OpenAI to develop generative AI applications was announced just last week, and its declared interest in applying AI to areas including fraud and scam detection shows the technology will remain a focus for the bank going forward. The agreement with OpenAI is set to “expand and accelerate” CBA’s use of generative AI, and the bank has been quick to emphasize that the partnership will be used to “better serve customers and support employees.”
The workers at the center of the row now face the decision of whether to return to a company that told them their jobs no longer existed. The FSU said that many would not return given that trust had been irreparably damaged, but that ultimately, the decision was in their hands. The bigger picture for CBA and the rest of the industry in Australia is more complicated, with the union’s victory paving the way for a second complaint over the bank’s use of AI.