Java Can Call COBOL Now. That Doesn’t Solve Your Problem.
Using FFM and GnuCOBOL is simple. Extracting real business logic from legacy systems is not
There are more COBOL transactions processed every day than there are Google searches. An estimated $3 trillion moves through COBOL systems daily. More new COBOL is written each year than Go. And as of right now, you can have a working COBOL program compiled and running on your MacBook in under five minutes.
This is not a joke. Well, it’s April 1st, so the idea to write about it started as one. But somewhere between getting GnuCOBOL installed and watching Java 24 call a compiled COBOL shared library via the Foreign Function & Memory API, the joke stopped being funny and started being interesting.
So let’s do both things. We’ll write and run COBOL on macOS, call it cleanly from modern Java, and then talk honestly about why none of that is actually the hard part of COBOL modernization.
Part 1: COBOL on macOS
Install GnuCOBOL
GnuCOBOL is a mature, open-source COBOL compiler that translates COBOL to C and compiles it with your system toolchain. On macOS arm64:
brew install gnucobol
Verify it works:
cobc --version
You should see something like GnuCOBOL 3.x. That’s your compiler. That’s all you need.
Write some COBOL
COBOL has a reputation for verbosity, and it earns it. The language was designed in 1959 to be readable by business managers as well as programmers. Every program is divided into DIVISIONs.
GnuCOBOL supports free-format COBOL, so you don’t need to worry about the fixed column layout you’ll see in mainframe codebases. You’ll encounter the fixed format when reading real legacy code, but for writing new programs GnuCOBOL detects free format automatically.
We’ll write two files: the subprogram containing the business logic, and a driver to verify it works before we involve Java. This is also how real COBOL codebases are structured. Subprograms are always called, never run directly.
calcinterest.cob — the subprogram:
IDENTIFICATION DIVISION.
PROGRAM-ID. CALCINTEREST.
DATA DIVISION.
WORKING-STORAGE SECTION.
01 WS-RESULT PIC 9(9)V99.
LINKAGE SECTION.
01 LK-PRINCIPAL PIC 9(7)V99.
01 LK-RATE PIC 9(3)V9999.
01 LK-YEARS PIC 9(2).
01 LK-RESULT PIC 9(9)V99.
PROCEDURE DIVISION USING LK-PRINCIPAL LK-RATE
LK-YEARS LK-RESULT.
COMPUTE WS-RESULT = LK-PRINCIPAL *
(1 + LK-RATE / 100) ** LK-YEARS
MOVE WS-RESULT TO LK-RESULT
EXIT PROGRAM.
A few things worth noting:
- PIC 9(7)V99 means a numeric field: 7 digits, then an implied decimal point, then 2 more digits. COBOL’s type system is built entirely around fixed-width decimal arithmetic, designed for financial data long before IEEE 754 existed.
- WORKING-STORAGE is local state.
- LINKAGE SECTION defines the parameters passed in from the caller.
- COMPUTE supports basic arithmetic including exponentiation (**). Not as alien as it looks.
- EXIT PROGRAM returns control to the caller.
- Make sure to have the extra blank line at the very end!
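To make the implied decimal concrete, here is a short Java sketch (Java, since that is where this article is headed) of how such a field decodes. The decodePic helper is a name of my own invention, not part of any COBOL tooling, and it assumes GnuCOBOL’s default DISPLAY storage of plain ASCII digits:

```java
import java.math.BigDecimal;

public class PicDecode {
    // A DISPLAY-format PIC 9(9)V99 field is eleven ASCII digits with an
    // implied decimal point two digits from the right. Decoding is just
    // a scale shift; there is no "." to parse.
    static BigDecimal decodePic(String rawDigits, int impliedDecimals) {
        return new BigDecimal(rawDigits).movePointLeft(impliedDecimals);
    }

    public static void main(String[] args) {
        // "00001552969" is how PIC 9(9)V99 stores 15529.69
        System.out.println(decodePic("00001552969", 2)); // 15529.69
    }
}
```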
driver.cob — standalone verification:
IDENTIFICATION DIVISION.
PROGRAM-ID. DRIVER.
DATA DIVISION.
WORKING-STORAGE SECTION.
01 WS-PRINCIPAL PIC 9(7)V99 VALUE 10000.00.
01 WS-RATE PIC 9(3)V9999 VALUE 4.5000.
01 WS-YEARS PIC 9(2) VALUE 10.
01 WS-RESULT PIC 9(9)V99.
PROCEDURE DIVISION.
CALL 'CALCINTEREST' USING WS-PRINCIPAL WS-RATE
WS-YEARS WS-RESULT
DISPLAY 'Result: ' WS-RESULT
STOP RUN.
Compile and test
Compile the subprogram as a shared library. On macOS, GnuCOBOL resolves CALL 'CALCINTEREST' by looking for a file named CALCINTEREST.dylib. Capitalisation and extension both matter:
cobc -m -o CALCINTEREST.dylib calcinterest.cob
cobc -x -o driver driver.cob
COB_LIBRARY_PATH=$(pwd) ./driver
Output:
Result: 000015529.69
$10,000 at 4.5% compounded annually over 10 years. The arithmetic is correct. You now have verified COBOL business logic running on your Mac.
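If you want to cross-check that number independently of COBOL, the COMPUTE formula translates one-to-one into Java. A throwaway sanity check, not part of the build:

```java
public class InterestCheck {
    public static void main(String[] args) {
        // Same formula as the COBOL COMPUTE:
        // principal * (1 + rate / 100) ** years
        double result = 10000.00 * Math.pow(1 + 4.5 / 100, 10);
        System.out.printf("%.2f%n", result); // 15529.69
    }
}
```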
Part 2: Calling It from Java with FFM
Java 22+ includes the Foreign Function & Memory API as a stable feature. If you’ve used it before, this will be familiar. If not, the short version is that FFM lets you call native libraries without writing a single line of C or JNI boilerplate.
You can read more about it in my older articles:
We’ll use jextract to generate Java bindings directly from the compiled library, keeping this as frictionless as possible.
Generate bindings with jextract
jextract ships separately from the JDK. Download it from jdk.java.net/jextract and make sure it’s on your PATH.
We need a C header file describing the COBOL function signature. Two things are worth knowing before you write it.
First, COBOL’s PIC 9(n)V99 fields are not floats or doubles. They’re fixed-width decimal strings with an implied decimal point. GnuCOBOL’s DISPLAY format stores them as ASCII digits. So we declare everything as char * and handle the implied decimal on the Java side.
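Hand-writing those zero-padded strings is error-prone, so a small helper is worth sketching. This is my own code, not part of jextract’s output, and it assumes GnuCOBOL’s default DISPLAY representation (ASCII digits, zero-padded, no decimal point):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class PicEncode {
    // Encode a value as a DISPLAY-format PIC 9(i)Vd field:
    // i integer digits plus d decimal digits, zero-padded, decimal
    // point implied rather than stored. Assumes the value fits the field.
    static String encodePic(BigDecimal value, int intDigits, int decDigits) {
        String digits = value.setScale(decDigits, RoundingMode.HALF_UP)
                             .unscaledValue()
                             .toString();
        return "0".repeat(intDigits + decDigits - digits.length()) + digits;
    }

    public static void main(String[] args) {
        System.out.println(encodePic(new BigDecimal("10000.00"), 7, 2)); // 001000000
        System.out.println(encodePic(new BigDecimal("4.5"), 3, 4));      // 0045000
    }
}
```

These are exactly the strings the Java demo below passes to CALCINTEREST.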
Second, GnuCOBOL requires its runtime to be initialised before any COBOL module executes. The runtime exposes cob_init for exactly this purpose. We include it in the header and call it first.
Create calcinterest.h:
void cob_init(int argc, char **argv);
void CALCINTEREST(
char *principal,
char *rate,
char *years,
char *result
);
Now generate the bindings, linking against both CALCINTEREST and the GnuCOBOL runtime cob:
jextract \
--output src \
-l CALCINTEREST \
-l cob \
--target-package cobol \
calcinterest.h
This produces a cobol/calcinterest_h.java with all the FFM plumbing written for you.
Call it from Java
import java.lang.foreign.Arena;
import java.lang.foreign.MemorySegment;
import cobol.calcinterest_h;
public class CobolDemo {
public static void main(String[] args) throws Throwable {
try (Arena arena = Arena.ofConfined()) {
// Initialise the GnuCOBOL runtime before any COBOL module executes
calcinterest_h.cob_init(0, MemorySegment.NULL);
// Pass values as fixed-width decimal strings matching the PIC clauses:
// PIC 9(7)V99 → 9 digits, decimal implied at position 7
// PIC 9(3)V9999 → 7 digits, decimal implied at position 3
// PIC 9(2) → 2 digits
// PIC 9(9)V99 → 11 digits, decimal implied at position 9
MemorySegment principal = arena.allocateFrom("001000000"); // 10000.00
MemorySegment rate = arena.allocateFrom("0045000"); // 4.5000
MemorySegment years = arena.allocateFrom("10");
MemorySegment result = arena.allocate(12); // 11 digits + null
calcinterest_h.CALCINTEREST(principal, rate, years, result);
// Insert the implied decimal: PIC 9(9)V99 = 9 integer + 2 decimal digits
String raw = result.getString(0);
double output = Double.parseDouble(raw.substring(0, 9) + "." + raw.substring(9));
System.out.printf("Compound interest result: %.2f%n", output);
}
}
}
Compile and run
libcob.dylib is installed by Homebrew but not on the default dynamic linker path. Use DYLD_LIBRARY_PATH to tell the OS where to find it. -Djava.library.path is not sufficient here: FFM’s SymbolLookup resolves symbols through the OS dynamic linker, not the JVM:
javac -cp src src/CobolDemo.java
DYLD_LIBRARY_PATH=.:/opt/homebrew/opt/gnucobol/lib \
java --enable-native-access=ALL-UNNAMED -cp src CobolDemo
Output:
Compound interest result: 15529.69
Java just called compiled COBOL business logic through a generated FFM binding with no JNI, no glue code, and no C. The type mapping required some care (char * buffers sized to match the COBOL PIC clauses, cob_init to boot the runtime), but the jextract stub handled everything structural.
Take a moment to appreciate that this actually works. Modern Java is quietly very good at this.
Part 3: So Why Is Any of This Still a Problem?
Here’s where the April Fools’ framing earns its keep.
If calling COBOL from Java is this clean (a Homebrew install, a compiler flag, a generated stub), then what exactly is the multi-billion-dollar COBOL modernization problem everyone keeps talking about?
It’s not the interop. It never was.
The documentation problem
Enterprise COBOL codebases are not documented. Not poorly documented — undocumented. The developers who wrote them retired, or died, or have been out of contact for twenty years. What comments exist are frequently wrong, describing what the code was supposed to do in 1987, not what it actually does after forty years of patches.
Variable names like WS-FIELD-047, ACCT-PROC-TMP-2, and CALC-X are not unusual. The business logic lives in the behavior, not the source, and the only way to extract it is to run the code with known inputs and observe outputs. To make it worse, you need to do this systematically, at scale, for every edge case your business has accumulated across decades.
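That observe-and-record loop can be sketched in a few lines. Everything here is illustrative: recordGolden is a hypothetical helper, and the stand-in legacy function would in practice be the FFM-backed COBOL call from Part 2:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class Characterize {
    // Feed every accumulated edge case through the legacy code and record
    // the observed output verbatim. The recorded map becomes the oracle a
    // rewrite must match, bugs and all.
    static Map<String, String> recordGolden(List<String> inputs,
                                            Function<String, String> legacy) {
        Map<String, String> golden = new LinkedHashMap<>();
        for (String input : inputs) {
            golden.put(input, legacy.apply(input));
        }
        return golden;
    }

    public static void main(String[] args) {
        // Stand-in for the real thing; in practice this would be the
        // FFM-backed COBOL call, fed with field-maximum and all-zero inputs.
        Function<String, String> legacy = String::toUpperCase;
        System.out.println(recordGolden(List.of("a", "b"), legacy)); // {a=A, b=B}
    }
}
```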
Architecture mismatch
COBOL programs were designed around batch processing. They read sequential files, process records one at a time, write output files, and exit. The mental model is a job, not a service.
That compound interest function above is a friendly fiction. Real COBOL business logic doesn’t decompose into clean functions with typed parameters. It’s often a single PROCEDURE DIVISION spanning thousands of lines, reading from multiple input files, writing to multiple output files, with behavior that varies based on flags set earlier in the job stream. Those flags live in JCL (Job Control Language), not in the COBOL source at all.
Wrapping that in a REST endpoint is not a technical challenge. It’s a conceptual one. What is the service boundary? Where does one “function” end and another begin? Nobody knows. That knowledge was never written down.
The mainframe is not just a runtime
GnuCOBOL on a MacBook is a genuine, working COBOL compiler. It is not what runs the bank.
Production COBOL runs on IBM z/OS, under CICS (a transaction monitor), talking to VSAM files and DB2 databases, scheduled by JES2, with memory models and I/O characteristics that have no equivalent anywhere else in computing. The mainframe is not a slow x86 server. It’s a different computational universe with fifty years of operational assumptions baked in.
When people say “we’ll just recompile it on Linux,” they’re solving the syntax problem and ignoring the environmental one. The COBOL might compile. The surrounding infrastructure, the file systems, the transaction semantics, the batch scheduler integration, doesn’t have a port.
The high-profile graveyard
And the failed attempts are publicly documented and expensive:
Commonwealth Bank of Australia spent $750 million and over five years on a core banking modernization. How that effort ultimately ended is not clearly documented.
TSB Bank in the UK migrated off a legacy platform in 2018. The cutover failed catastrophically. 1.9 million customers were locked out of their accounts for days. The CEO resigned. The eventual cost exceeded £330 million. A Parliamentary report later found the bank had underestimated the complexity of the data migration and the interdependencies in the legacy system.
The United States FAA has been attempting to modernize COBOL-based air traffic control systems for decades. Multiple programs have been cancelled, restarted, and redesigned. The systems still run. At the end of 2025, yet another public update arrived on yet another NextGen approach.
In each case, the proximate cause of failure was not “COBOL is hard to rewrite.” It was insufficient understanding of what the existing system actually did. Especially at the edges, under load, in failure conditions. Before the rewrite began.
The Honest, Not-April-1st Takeaway
Let’s be precise about something, because it tends to get lost in the modernization conversation:
Rewriting code in another language has never been the challenge. It will never be the challenge.
Code is the easy part. Code is the artifact left behind by decisions. The challenge is recovering the decisions themselves and understanding why the code does what it does, what business rules it encodes, what failure modes it handles (deliberately or accidentally), and what the surrounding system assumes about its behavior.
A COBOL-to-Java rewrite that faithfully reproduces the source is often worse than useless, because it replicates not just the business logic but also the bugs, the workarounds, and the architectural constraints of 1970s batch processing, now expressed in Java, which provides none of the operational tooling that made those constraints manageable on z/OS.
The FFM approach demonstrated in this article is actually a reasonable modernization strategy. You call the existing COBOL. You write tests against the real behavior. You understand what it does. Then, incrementally, you replace pieces you understand with Java you trust. The strangler-fig pattern applied to a fifty-year-old codebase.
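In code, strangler-fig routing is just two implementations behind one interface, with traffic sent to the rewrite only for cases already characterized against the legacy behavior. All names here are illustrative, not from this article’s earlier code:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

interface InterestCalculator {
    BigDecimal compound(BigDecimal principal, BigDecimal ratePercent, int years);
}

public class StranglerFig {
    public static void main(String[] args) {
        // The legacy path would delegate to CALCINTEREST via the FFM binding.
        InterestCalculator legacy = (p, r, y) -> {
            throw new UnsupportedOperationException("FFM call goes here");
        };
        // The rewrite, trusted only where its output matches recorded behavior.
        InterestCalculator modern = (p, r, y) ->
                p.multiply(BigDecimal.ONE.add(r.movePointLeft(2)).pow(y))
                 .setScale(2, RoundingMode.HALF_UP);

        // Routing rule: verified territory goes to the new code,
        // everything else stays on the COBOL path.
        int years = 10;
        InterestCalculator chosen = years <= 30 ? modern : legacy;
        System.out.println(chosen.compound(
                new BigDecimal("10000.00"), new BigDecimal("4.5"), years)); // 15529.69
    }
}
```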
Before we get there, a moment of genuine appreciation is warranted. IBM has maintained backward compatibility on the mainframe across decades of hardware generations. COBOL written in the 1970s runs unmodified on a modern z16. That is not an accident. It is the result of deliberate, sustained engineering discipline that the rest of the industry routinely fails to match. While everyone else was breaking APIs and deprecating runtimes, IBM quietly kept the lights on for the most critical financial infrastructure on the planet. That deserves respect, not condescension. The mainframe didn’t survive because organisations were too lazy to migrate. It survived because IBM made survival the path of least resistance, and because the alternative kept proving catastrophic.
The goal of modernization should be manageability: the ability for a team today to understand, modify, and operate the system. Sometimes that means rewriting in Java. Sometimes it means wrapping in a well-tested service boundary. Sometimes, it means leaving the COBOL alone and putting better observability around it.
What it never means is treating the language as the problem.
The code in this article runs. GnuCOBOL is real, FFM is stable, and jextract saves you the JNI ceremony. If you want to spike this yourself, the entire setup takes about twenty minutes.
The mainframe problem takes considerably longer.




