Sigh. Bloomberg-er here -- this article is old. There is certainly a lot of Fortran remaining in sections (you don't rewrite working code with C linkage in another language just because -- you rewrite it if you have something to gain), but it is nowhere near as bad as this old article makes it sound. Many teams are C++ only and have been for a long time (see our open-source code on GH for a sample). Keep in mind that if you deal in absolute numbers you are missing the bigger picture -- the actual real number will be much smaller and it will be an even smaller percentage of the total codebase.
edit: Can mods put (2006) in the title? Seems reasonable...
Seconded. I worked at Bloomberg from 2003-2007, and over the course of that time I think I needed to change about five lines of Fortran total. It was mostly used in "if it ain't broke" systems that weren't seeing a lot of change; everything under active development had long since been rewritten in C, C++, or a higher-level language.
I'd be interested to know what kind of systems remain Fortran. Fortran has some serious advantages over C or C++ when it comes to high performance math and stats. I wouldn't be surprised if a lot of the heavy number crunching is done in Fortran, while C++, etc. are taking over functions that are further away from the hard math.
One fun thing you can do in Fortran (on a 32-bit machine) is compare four-character strings using integer equality, rather than the Fortran equivalent of strcmp(). And it's basically baked into the language; you don't have to do weird typecasting or anything like that.
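A rough sketch of that trick in Python (the helper is invented for illustration; in Fortran on a 32-bit machine this works implicitly, because a four-character string and an INTEGER occupy the same four bytes of storage):

```python
import struct

def as_int32(s: bytes) -> int:
    # Pack exactly four bytes into one 32-bit integer, so string
    # equality becomes a single integer compare instead of a
    # character-by-character strcmp-style loop.
    assert len(s) == 4
    return struct.unpack("<I", s)[0]

print(as_int32(b"HELP") == as_int32(b"HELP"))  # True
print(as_int32(b"HELP") == as_int32(b"NEWS"))  # False
```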
That's why all the Bloomberg functions have four-letter names.
What REAL advantage does the switch to C++ from Fortran have for you?
And no, not knowing Fortran doesn't count. At 18, at my first job, I was told: "Here is a book on Fortran -- learn it. You graduates with fancy CS degrees can do the same."
When you actually take a look at the bulk of FORTRAN, it looks suspiciously like C -- and it has C linkage. So, to me personally, I lump FORTRAN and C together in my head and then the question becomes "What real advantage does the switch to C++ from C have for you?". You can find vast amounts of information on the web to answer that question :)
I have not known a single person in over a decade to go out and specifically learn FORTRAN as a separate language. Schools still provide graduates with C programming skills and any C programmer comfortable with the language can not only immediately read most FORTRAN, but can modify it as well.
> "What real advantage does the switch to C++ from C have for you?". You can find vast amounts of information on the web to answer that question :)
As well as vast numbers of religious flamewars. I'm personally of the camp that (C || Fortran || "C with classes") + (Java || C# || Python || ${HIGHER_LEVEL_LANGUAGE}) is better for nearly any complex, performance-sensitive task.
People also have this false preconception that you have to write everything in Fortran 77. You can keep an F77 codebase alive while writing Fortran 90 (0) code, which is surprisingly modern, has amazing mathematical computational performance (especially with Intel's MKL and compiler), and has excellent OpenMP integration.
I'm with you on combining the best tool for the job, and sometimes having a Java/C/Python wrapper is useful, but you can retain your tried-and-true-and-tested Fortran codebase while adding feature enhancements compiled against the Fortran 2003 standard, often eliminating the need for wrappers. I guess the practical problem is that finding F2003 engineers is way more challenging than finding a Java or C# dev.
0: F90 was the "big" upgrade that made Fortran "modern", but there have been continuous upgrades like F95 and Fortran 2003 (I guess this would be analogous to C++17), which make it perfectly fine to write in.
> I have not known a single person in over a decade to go out and specifically learn FORTRAN as a separate language.
I much prefer C/C++, but Fortran remains pervasive in scientific computing, and being conversational in it remains a prerequisite for fields in high-performance computing, IMO.
The fact that I can do something does not mean that I would like to do it. Learning Fortran from scratch in 2015 might be a bad career decision for a new grad.
For someone starting out, there's such a thing as opportunity loss. What language would that person have learned if they had not spent that time learning Fortran? Would that language have been more marketable in their next job?
Further, what you used at your last job matters. Particularly if that job is the _only_ real job since you got out of school. If you just learned Fortran for fun, or for one job in a long list of jobs, and your GitHub account is littered with good work in multiple languages, Fortran can indicate flexibility, diversity, and a broad skill set.
If all I see is one job with Fortran for 2 or 3 years after school I might hesitate to hire you for a <insert language here> job at anything but entry level.
edit:
Just a note, I think Fortran is a fine language, and anyone who's done a decent amount of linear algebra work knows what a big role Fortran plays in analytics.
My main point is that you should care about what your first job(s) are coming out of school because they can shape what the next jobs are going to be. So make sure it's something you want to be doing.
The negative is that time is a finite and limited resource. If a junior person spends their time learning about Fortran or some other esoteric thing while their peers learn skills or languages that are in demand, the junior person that learned Fortran will be at a disadvantage.
Of course, if you're doing it just for fun, then I'd say there's no negative. But from a career perspective, you'd be better off learning more relevant skills.
> Learning Fortran from scratch in 2015 might be bad career decision for a new grad.
Isn't it still commonly used for scientific computing? So, maybe it would enable you to work on other kinds of neat stuff than the latest javascript UI framework.
It is still very common in scientific computing, to the point that some of the best researchers in the field teach online classes using it: https://www.coursera.org/course/scicomp
Not necessarily. With even a rudimentary familiarity with it, or with another language like COBOL, they could find a six-figure gig right out of school in several markets. They don't need to specialize in it; just knowing enough to get by will make them coveted by hundreds of defense and government contractors, for example.
If you read over BDE docs on Github you'll see that it currently targets C++03. It has a few C++11 detected features, and each new version release gains more C++11/14 support. So to answer your question -- neither. The vast majority of code is Modern C++03 and it is in the process of moving towards C++11/14.
It's more exasperation than anything else... that the only things that seem to make the news cycle wrt Bloomberg are "OMG!! Fortran!!" when it comes to programming, and the we-know-more-than-you five-year-old UXMag article when it comes to our UX design and roadmap, coupled with the fact that people think a 10-year-old gif they found on Google Images is what the software actually looks like today.
IMO, comment sections aren't really a great place for discoverable information, so maybe I'm thinking more in-depth posts need to be created that actually peek behind the curtain a bit so they are ranked better when searching. Right now, you can only glean information about certain areas/technologies from individuals that represent them if you know where to look. e.g. Here's a recent video interview with Matt Hunt from our Portfolio Analytics team regarding some of their problems/infrastructure https://www.youtube.com/watch?v=cMj2V3U4J3k
I think this kind of obsolete information may also hinder them when they try to hire. Good engineers may get a bad impression of what it's like to work there and overlook them.
They could change that perception by being more open about how they go about developing software (like their GitHub account). But these articles showing up every once in a while work against those efforts, hence I can see why the OP gets annoyed by them.
Last time I was in the Bloomberg building much of their Dev Dept was underground, and they have a neon art display which is supposed to simulate a sunset, or natural light, or something.
Sure, visiting Bloomberg and getting the tour leaves you with a wonderful impression, but those I know who have worked there in the past wouldn't compare the work environment to Google.
Bloomberg is much more of an "East Coast" company, and feels like it.
Having said that, I had a Bloomberg Terminal subscription for five years. I friggin' loved it. Regardless of whatever might be behind the screen, it's still an amazing, powerful, gorgeous tool. I miss my B-Unit, oh so badly.
Not sure what you're talking about. The "underground" lower level is all conference / presentation rooms. No one works there. The simulated sunlight you're referring to is a large art piece installation in the building and the creator chose to simulate light as he saw it during a particular sunset (in Arizona IIRC? It's been a while...)
Check out the new SF office we opened up if you're looking for a more West coast feel.
The article seems to assume that there is something "wrong" with running on 25M lines of Fortran. It does not provide any technical reason why Fortran is not the right tool for the job. Why does this need to be "Web 2.0"?
I'm also confused about this. I don't understand what is wrong with Fortran. In school we learned Fortran 2003 for numeric simulations etc. For my master thesis I programmed in C and there were a lot of things in C that I missed from Fortran.
1) Fortran is easy to learn, no need to know pointers etc
2) very math friendly: easy to extract subarrays/submatrices, no need to pass around array lengths, and matrices are really easy to deal with. I remember in C we ended up rewriting all two-dimensional arrays (matrices) as column-major one-dimensional arrays because they were a pain to deal with (or maybe we were noobs)
3) useful built in functions such as matrix multiplications, in C you end up writing a lot of functions to do what seem like very simple things
4) Great math libraries such as BLAS/LAPACK
5) It seems less error-prone than C (I spent a lot less time debugging)
This is just my experience with it. Somewhat to my surprise I really ended up liking Fortran, and the new code doesn't look at all like the old code. For what we did it was also slightly faster than C. If I were to write a numeric program I wouldn't think twice about what language to use. I feel like Fortran is an undervalued language for calculations. But then again, I am by no means an expert.
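To make points 2 and 3 above concrete, here is a sketch of the Fortran-style conveniences mimicked with NumPy (column-major storage, sliced submatrices, built-in matrix multiply); the array contents are arbitrary examples:

```python
import numpy as np

# A 4x4 matrix stored column-major ("Fortran order"), as in point 2.
a = np.arange(16, dtype=np.float64).reshape(4, 4, order="F")

# Extracting a submatrix is a one-liner, much like Fortran's a(2:3, 2:3);
# no manual index arithmetic or explicit length arguments are needed.
sub = a[1:3, 1:3]

# Built-in matrix multiply (point 3), backed by BLAS just like Fortran's
# MATMUL; in plain C you would write this loop nest by hand.
c = a @ a

print(sub.shape)  # (2, 2)
```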
> Now, if Bloomberg's code is a spaghetti monster (I have no idea), that is a problem. But that is a problem of legacy code, not a 'Fortran' problem.
This I think is the key point. Often, when I've seen people mistake a practices problem like this (one that results in expensive-to-maintain "legacy" code) for an implementation-language problem, they end up doing a conversion to a new language without fixing the process issues -- often doing the worst thing you can do with a poorly specified, poorly documented, but basically working codebase: a "big-bang" replacement rather than an incremental one. And so they end up creating a new poorly-specified, poorly-documented, often less-well-working "instant legacy" codebase that just happens to be in a currently fashionable language.
A client of mine still has a fairly large piece of Delphi code in production, and I must say it's a bit of a liability for them. An example of a technical problem they face is getting the particular version of Delphi they use to work with TLS (and that's becoming a problem now that everyone's starting to disable SSLv2 & SSLv3). The fact is that they simply wouldn't face this kind of problem if they had ported their code to something like Python, C# or Java.
On the non-technical side of things, they also struggle to hire developers who are capable of working with (or willing to work with) Delphi, especially junior developers. A significant portion of their staff also doesn't know how to work with Delphi, so reassigning resources to match the needs of the business is often a challenge.
I imagine these issues would probably apply to Fortran as well.
Fortran is entirely modern. Last standard is what, 2008? And any compiler will just eat any previous version without complaint. You can take a huge swath of '77 code, put some 2003 and 2008 in there, and it will all just compile.
Python? Good luck. Every frigging month (practically) it's a breaking change. You have to work furiously just to keep your infrastructure working, or purposefully draw a line in the sand (2.7!).
I run Fortran every day (via NumPy and other packages), and some of us are investigating just switching to it for some projects (we need the math speed, and all the C++ matrix libraries involve compromises of one form or another). There's no reason I and my peers won't be doing the same in 50 years. It works, it's fast, it's debugged, and it is easy to use things like Python 5000 (by then over 70% of people will have made the transition from 2.7 to the 3.x version!!!) to do all the glue work that Fortran is not optimized to do.
Fortran codes ain't ever going to be a liability[1]. Python 2.7 on the other hand...
[1] because of being Fortran. Poor coding style was endemic in Fortran 77, and that can be crippling, but to a large extent that is language-agnostic. I acknowledge that the required short variable names of '77 don't help, but people would not comment what the variables meant, and that is the real problem (IMO). Poor structure is another problem (masses of gotos). Yes, '77 doesn't make your life easy there, but discipline is possible. We can quibble about all this; my point is that the 'modern' languages are going to be old quite soon, and we will face real porting problems then. Move 20,000,000 lines of JS and Python? Good luck.... Get a Python 2.4 code base working with 3.9? Ah, hmm, well .....
But Fortran? We build it and run it every day, easy peasy.
Python? Good luck. Every frigging month (practically) it's a breaking change. You have to work furiously just to keep your infrastructure working, or purposefully draw a line in the sand (2.7!).
What? There was exactly one breaking point, from Python 2 to Python 3. Which "monthly breaking changes" are you talking about?
Probably not the language itself, but the quivering mass of libraries with moving APIs and bolted-on design fads. Numerics is numerics--functions, matrices, linear algebra and indexing isn't going to change any time soon.
That Fortran gets bashed and Lisp is worshiped seems quite incongruous.
I'm not a Python developer but what I've seen and find frustrating is the slow migration of libraries to Python 3. Some projects have switched to 3 and some are sticking with 2 for now. Some applications are using 2 because they have dependencies using 2, but as those dependencies switch applications have to deal with combining Python 2 and Python 3 dependencies.
I know each issue is easy in isolation, but we're talking about old 25M line codebases here. If such things exist in Python it can be a maintenance problem.
I'm in favor of languages making breaking changes occasionally but there is something to be said for languages like Java or Fortran that are intended to keep compatibility forever.
"There was exactly one breaking point, from Python 2 to Python 3"
One breaking point?
The majority of the libraries that I use don't even work properly in Python 3. The standard MySQL library, for instance, which is used by the majority of ORMs (including SQLAlchemy), isn't available for Python 3, which means you need to make major changes or try to use a different library.
The names of many standard library modules were changed, breaking lots of code. Example: ConfigParser to configparser.
Even simple things like print 'test' became print('test').
These are only a few examples. There are lots of other major changes that break 2.X code.
I have been using Python 2.X for all of my projects and wanted so badly to change to 3.X for my new projects. However, because of all the incompatibilities, I can't.
I don't think I'm alone on this either. Python 3.X still isn't being used by the majority of developers.
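For what it's worth, the two specific breakages mentioned above look like this once ported to Python 3 (the config snippet is made up for illustration):

```python
# Python 2:
#   print 'test'
#   from ConfigParser import ConfigParser
# The Python 3 equivalents:
print('test')                           # print is now a function

from configparser import ConfigParser   # module renamed in Python 3

cp = ConfigParser()
cp.read_string("[db]\nhost = localhost\n")
print(cp.get("db", "host"))             # localhost
```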
> (we need the math speed, and all the C++ matrix libraries involve compromises of one form or another).
Are you using lots of really tiny matrices? I would think pretty much all the libraries could just be linked with a good BLAS distro and then it would not matter what language that you use. Personally, I'm a big fan of the Armadillo C++ library.
Delphi could be considered a proprietary language while Fortran has multiple standards. So tooling switches are not quite as big an issue.
The lifecycle of languages is a hard issue regardless of language. Unless you keep switching to the current new shiny thing, you will have developer-hiring issues. I think there are a few more Fortran programmers than Delphi programmers these days.
How is Python even being mentioned as a replacement for Fortran? That is like replacing C code with Python just because ...
Sorry, I really love Python, but I see it as a slow, lumbering beast that is always a good second-best choice for everything it does. Great if you only know Python or only want to know Python, but for production? Speed is the main reason why Python is not more widely used.
> How is Python even being mentioned as a replacement for Fortran?
It's probably being mentioned as such because a lot of the reason people use Fortran is the key high-performance libraries for particular numeric application domains -- not because Fortran is necessarily the best language, aside from the existence of those libraries, to express higher-level solutions in those domains. And because Python has convenient wrappers for those libraries, where that is the motivation for using Fortran, Python is a reasonable alternative.
> Speed issues is the main reason why Python is not more widely used.
Sure, and that's why you wouldn't (with the current implementations available) want to write the low-level numeric routines underlying NumPy in Python.
OTOH, once you have NumPy, a lot of time Python makes perfect sense as the language to use for higher-level solutions that rely on the functionality provided by those libraries.
The slowdown from using Numpy (or Octave or whatever) instead of Fortran can be anywhere from 100× to 1.0000000001×. It depends almost entirely on how much of your compute time is spent inside standard computations on large arrays. If your arrays are tiny, or you're doing irregular things to them and can't figure out how to vectorize them, figure 100×. But there are lots of Numpy programs that fire off a few dozen Numpy calls per second, so the bulk of the code executed is Fortran (or maybe C) in any case. The extra microsecond to interpret the Python bytecodes to fire off the call just isn't significant.
More typical is about 3× or 5×, due to iterating over the data many times (once per Numpy call) instead of once, thus bottlenecking on main-memory bandwidth.
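A small illustration of that multiple-pass effect, using NumPy (the array and expression are arbitrary examples):

```python
import numpy as np

n = 1_000_000
x = np.random.rand(n)

# Each NumPy call is a separate pass over the data: this expression
# streams x through main memory once per kernel (sin, square, add).
y = np.sin(x) + np.sin(x) ** 2

# A hand-written Fortran (or C) loop would compute the same result in a
# single pass, touching each element once -- hence the typical 3x-5x gap
# on memory-bound code, even though each individual NumPy kernel is fast.
# Reusing the intermediate at least avoids recomputing sin:
s = np.sin(x)
y2 = s + s ** 2

assert np.allclose(y, y2)
```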
Personally, I like to learn new languages, so I might be fairly biased.
But about 18 months ago, I took over maintenance of a small-ish Delphi application when the original developer left the company. It's not a super-fun language for me, but to me it was fairly easy to get into.
I think a lot of people assume Fortran == FORTRAN 77, which is a terrible language without even full support for structured programming and with an annoying fixed-field syntax. There are good reasons to hate on it.
Most Fortran hate is actually F77 hate.
Fortran 90, on the other hand, is basically C with a different syntax. F90 (and newer: F95, F03, F08) really isn't a worse language than C, and there's no reason to hate on it when you wouldn't hate on C, but a lot of people don't know it exists because they equate Fortran with F77.
It seems like you're purposely misunderstanding the point of the article to fit your worldview. This is less about the new hip things the kids want and more about running a business.
Do you know FORTRAN? Do you know any programmers who know FORTRAN? Do you know anyone who wants to learn FORTRAN? Unless you're in the community the answer to all of those is no. And I can't imagine it's a big or growing community. The obvious consequence of this is FORTRAN will become more and more expensive to run a business on as time goes by because fewer and fewer people will have experience with it or any desire to use it. It's going to be harder to find people to port it and even harder to find people to maintain it.
As to it becoming "more and more expensive to run it", that kind of depends. Most of that Fortran code isn't touched very often. The cost of swapping it out is huge.
Swapping it out piecemeal isn't as expensive money-wise, but it carries large risk. Should it be ported out? Sure. Can it be, in any reasonable way? That's a harder question, and it's not as cut-and-dried as you make it sound.
> Unless you're in the community the answer to all of those is no.
Fortran is still widely used in mathematics and science. So if that's the community bubble you're referring to, then yes, there's definitely a niche there in my experience (but not purely for legacy reasons).
I've got a couple of friends who work at pretty big SAS shops. None of them knew SAS before being hired. Their employers didn't care about their (lack of) SAS skills, and no SAS questions were asked at the interview. They simply hired them, sent them on a couple of 3-4 day SAS training courses, and expected them to pick it up.
Are they statisticians by any chance? Their domain knowledge would be much more important than programming experience. Conversely, for a programmer experienced in a classical PL (FORTRAN, LISP, C, Java), SAS is quite a shock. Data-oriented computation models (like SAS has) are not so common any more.
Not really. I write a lot of SAS (I'm an engineer at an industrial plant); it is what we use to query all our production databases and to write reports for operators.
If you know SQL or anything about databases, SAS is very easy to pick up. My background from university was Java and C, and I had few difficulties.
My housemate is a statistician; he works in finance and uses R. My understanding is that R is a bit more difficult, as it doesn't have anything like SAS's PROC SQL -- never used it myself, though.
Coincidentally, Fortran is used very heavily in models here as well.
There are idiomatic ways to use every language. Much in the same way that you can write C++ that looks a lot like C, you can write FORTRAN that looks almost exactly like C. So if you have a lot of FORTRAN and thousands of C programmers, it isn't a huge problem. If your codebase was in COBOL, maybe you'd have a problem...
It's a pretty great thing viewed through another lens, you hire kids and teach them FORTRAN. They won't jump ship as easily when their work experience is all FORTRAN.
The problem is that the code base is byzantine with developers hired who don't have good Fortran knowledge.
There are a million different versions of the same function each with a minor tweak to get the "right" behaviors labeled func_name_1 down to func_name_1000000 because everyone is too scared to fix the original function as they don't know what depends on it.
It's not that Fortran is the right tool for the job it's that it has terminal momentum.
> The problem is that the code base is byzantine with developers hired who don't have good Fortran knowledge.
> There are a million different versions of the same function each with a minor tweak to get the "right" behaviors labeled func_name_1 down to func_name_1000000 because everyone is too scared to fix the original function as they don't know what depends on it.
Sounds like a problem with development practices, documentation, and overall code lifecycle management -- and also employee management in terms of skill development priorities.
Those issues aren't particularly tied to implementation language.
It also depends very much on which Fortran we're talking about. Fortran has evolved more than just about any other language out there, and Fortran77 (not a very good language) is very different from Fortran08 (a pretty great language).
Exactly right. I first coded in Fortran in the mid 80s on a mech eng code base written in Fortran 66. Fortran 77 was avoided because it was too new. In the early 90s I worked in reservoir engineering where we used 77. Fortran was the choice for number crunching, especially on hardware like Crays that used vector processors that could be exploited by specialised compilers. Those early Fortrans had very little in the way of abstraction. Data was passed around in 'common blocks', and there was no dynamic memory allocation. Remember that Bloomberg started in the 80s. So it's likely that they have a huge Fortran codebase in one of these old variants of the language.
Most people will point out the support issue, namely that getting knowledgeable people to work on it might be difficult. In the case of the code I have worked on, while we may rewrite a working system in a newer language, it is more typical to wrap known bulletproof functions in layers that present them in new formats: web/mobile/etc.
Most people who know C can pick up Fortran in a few days. I am not a Fortran guru, but I learned enough to write a lot of code for a few scientific projects in school. I really don't understand the mindset that you "need Fortran programmers to write Fortran".
Fortran code isn't inherently technical debt, any more than code in any other language is.
> Do you still have a VCR? Isn't it a bad idea to assume VHS is a good medium to archive?
I fail to see the relevance of the analogy; it's not exactly as if there is a shortage of Fortran toolchains -- F/OSS and commercial -- maintained and available.
Keeping Fortran code is technical debt in a similar way that keeping your VHS videos is technical debt.
Sure, you may be able to find tools to use Fortran in the near term. But how do you protect the future? Does it play well with emerging systems/protocols? Do you have developers to maintain it? Is there a clear path to upgrade to newer platforms and frameworks?
Two reasons are maintenance and hiring. How many talented Fortran developers are out there now (and living in NYC) available to fix and maintain code that your entire business depends on?
My own take on that was: "Wow, they surely must do a lot of performance sensitive stuff to use the fastest language available for number manipulation. Kudos to them."
I'd argue two points here: Recruiting and technical debt.
There may be challenges getting the talent you want with legacy systems, and I believe you're building up technical debt, as you'll hit a wall with your infrastructure and have to upgrade/migrate one day anyway.
The top comments on this posting are ex-employees talking about how they've been migrating many (most?) parts of the system to C++, so yeah, even Bloomberg management agrees staying on FORTRAN is probably a bad idea.
I don't remember the exact figures but 25m lines of Fortran seems plausible, but it was still the minority of the code base. The majority at the time was in C and C++, with probably a few million lines of Perl and Javascript as well.
Plenty of the Fortran dates from the 80s; rewriting it in a modern language would be a huge project with lots of risks and limited upside. No significant new Fortran has been written for a long, long time (>10 years).
It's also a mistake to think of Bloomberg as a single piece of software; it's closer to an app store. There are thousands of specialist apps (functions, in Bloomberg terminology), and most users only use half a dozen, but which half-dozen varies significantly between users.
This makes it very hard to displace wholesale. There are individual pieces a competitor can go after but identifying a subset of functionality that's enough to convince users to switch is hard.
There's also a lot of stickiness which goes beyond pure functionality. Network effect and brand are also key parts (having a Bloomberg Terminal is a status symbol).
Is any of the ticker plant code behind the dealer quote pages in Fortran? Or any of the auto-ex stuff? Curious, as that's the stuff I had to plug in to. Via an ION gateway, naturally...
I have been looking at some projects at Bloomberg from the outside, and even though the projects are interesting, I don't think I'd want to work there because of bureaucracy and inertia -- it's the kind of place where you'd need to get approval to install a text editor on your machine.
I have found that often "high profits" get in the way of customer service because they are a disincentive to "quality is free" thinking and make it possible to sustain the unsustainable for way too long.
To take an example, cable companies are so profitable that they think nothing of the cost of replacing cable boxes that break, or of excessive truck rolls. These not only cost money but also anger consumers. My mother-in-law quit cable in disgust and switched to OTA TV after she had three cable boxes burn out in three months and would have to go stand in a long line to return each one and pick up a new one.
Quality is free and screwing up is expensive -- even if you can afford to screw up it costs you customers and it costs you employees. But again, turnover is no problem if you are making enough money you can afford to spend 5x what things should really cost.
Sure I have. That's why I said "something" instead of "a lot". They were investigating C++ in 2006, and I've actually talked to a C++ developer from Bloomberg, so something has changed. Just search their job listings:
I still feel Fortran gets a bad rap. Fortran has been a workhorse and is extremely fast. (Mind you, I learned assembly on a C64, so Fortran looked so much "nicer" back in the 80s, and I don't program in it at all.) But it seems like everyone treats it like it is some mothballed slowpoke.
BBG's competitive advantage is not data - in fact, it's pretty mediocre. Any reasonably large player sources their own data feeds due to too many holes in the former.
BBG's killer features are support and _chat_ - chat with other _trusted_ counterparties who also paid the admission price.
It would have been interesting to know if they are using Fortran 95 or the more recent (and very modern) 2003. People who like to beat on Fortran rarely know that it has evolved quite a bit since '66 and '77
I worked at bberg for 4 years. Changed about 2 lines of Fortran code. Some people do more, others do less, but the norm is you'll spend a sliver of your time in Fortran.
I have to disagree with the idea of a company actually coming in and taking out everything that Bloomberg has been building these past years.
This is an extremely hard market to get into. Bloomberg is extremely dedicated at what they do, and they do not take competition lightly. They use their massive power to make life for competitors as hard as they can.
As for Bloomberg, their terminal and all of the services that they provide are top notch. I work a lot with their data, and it is by far the best I have worked with.
Unhappy academic here -- codebases with millions of lines of Fortran are very prevalent here as well. It is both better and worse than a lot of people think; like a lot of the other comments said, that code is rarely touched, but when you need to touch it, it can be very painful. Modifying old Fortran is rough, mostly because of that awful, awful implicit typing. Though I will say that the native multidimensional array support does make certain sections much easier to read than corresponding vanilla C code.
The Bloomberg terminal and Thomson Reuters are certainly ripe for strong competitors. They have basically operated as a monopoly for decades. YC, developers, and others should take on this challenge.
http://www.nytimes.com/2015/09/10/business/dealbook/the-bloo...
> It's not like you can throw together some trading system with an eventually consistent datastore in the back. Losing a couple of trades without an audit trail crashes markets.
That's not consistency, though, that's durability. Consistency in banking is basically the part where you don't allow people to spend money twice. That can be "eventual" and violating it can be OK under some circumstances, like when your bank issues you an overdraft (and applies a fee) or when your airline sells too many seats.
So the eventually consistent data store in the back is fine... as long as it's durable and as long as you're willing to invest man-years of effort in understanding the implications of your selected consistency model so that you don't screw it up and can perform operations which are compliant with some set of business rules that the business understands and approves of, naturally.
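The consistency point above can be made concrete with a toy sketch (all numbers and names here are hypothetical, purely for illustration): two debits decide against the same stale snapshot of a balance, both succeed, and reconciliation later shows an overdraft, which the business then handles with a fee rather than treating it as a data error.

```python
# Toy sketch of "eventual" account consistency: two debits each read the
# same stale snapshot, both are approved, and the account is reconciled
# afterwards into overdraft. Under strict consistency the second debit
# would have been refused.
balance = 100

def debit(read_balance, amount):
    # Each debit decides against a snapshot, not the live balance.
    return amount if read_balance >= amount else 0

snapshot = balance           # both tellers read the same snapshot
a = debit(snapshot, 80)      # approved
b = debit(snapshot, 80)      # also approved; refused under strict consistency
balance -= (a + b)           # reconciliation: the account is now overdrawn
print(balance)               # prints -60
```

Whether that -60 is a bug or an overdraft fee opportunity is exactly the business-rule question, not a storage question.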
A trading platform must have a 1:1 mapping of buyer to assets when trades are agreed. You cannot have a situation where two people buy the same share, as it requires unwinding. A trade must be atomic.
Traders want to see all the bids/asks of the whole market. Otherwise you'll effectively have a random assortment of tiny markets under one supposedly united trading platform.
That's the other thing: trades need to be reversible. There is a good correlation between massive trade volumes and the need to reverse transactions. Having an eventually consistent model is just not going to work.
Being overdrawn is not eventual consistency. The act of an overdraft is not a failure of a bank to properly account for your balance; it's a deliberate business move to generate cash.
So no, it's consistency: one price, one bid, one ask.
The concept of overdrafting is a combination of a modern business rule and an ancient technological limitation (dating back to before computers, when accounting was manual and checks and checkbooks were also a big deal, which is hella eventually consistent). It can still serve as a useful metaphor, regardless.
But if you care about operating the EXCHANGE ITSELF then your allusion to the audit trail is a bit of a red herring, isn't it? In fact, the data store itself becomes pretty irrelevant for the live trade-matching - presumably you shard just the hell out of your data, keep the working set in RAM, keep most of your stuff colocated in the same place and just solve for availability by doing something crazy and hardware-intensive like having hot spares for everything.
And support. Being able to pick up the phone any time and quickly get someone on the other end who really knows their shit is a big thing you're paying for.
First it's Thomson Reuters. Second, YC has little chance in this market which requires large capital for the data feeds, and has enormous incumbent advantages. Third, FactSet is a player there, it's not even a duopoly.
There is no such thing as "incumbent advantages" for a market that is based on technology that could be duplicated. The same is applicable to the payment systems that PayPal, Stripe, and others are taking on now. Same with the car industry that Tesla is working on. Did you read the linked article about MONEY.NET? The thing is that developers have not been paying much attention to the nuts and bolts of these terminals, until lately. http://fortune.com/2014/03/20/can-the-bloomberg-terminal-be-...
Your better & duplicated technology is generally no match for the incumbent's connections, trust, familiarity, and fully expense-accounted sales team. That is the incumbent advantage, and why Oracle wins so often.
It's very much like message boards. HN and Reddit have the hot hand, and the connections overcome the better technology of other boards. Once the connections become too toxic, people will migrate to the next thing. Then you might have a shot, but the group with the best connections will probably be the next thing.
No incumbent advantages? Not even an advantage associated with the reduced financial risk of operating a money-making business in the space? No name recognition? No established working relationships between Sales and Important Customers? No firm-specific human capital in the form of all the little things their employees know about how business in this sector works?
I recently read about this company in the NYTimes which is trying to do just that: https://www.money.net/
$99/mo/user vs $21,000/year/terminal. Wow! I wonder what other kinds of entrenched, expensive products like the Bloomberg Terminal are in need of competition. I think this requires industry-specific experience to know, however.
Really interesting... this is the first I've heard of them (I'm nowhere near the finance industry). I wish finance companies weren't so tight-lipped; I can't find anything about their tech stack.
YC style companies are great for consumer software, but BB is directed at financial companies, which are very conservative and put a lot of value in support. There is no way to win in that space by being cheap, you need a lot of investment to provide mature code and high-quality support by real people.
By comparison, this is 33% more LoC than a BWR/PWR nuclear reactor (Monte Carlo method / quadruple integral (⨌) equivalent) simulator product formerly known as CORETRAN-01.
Finite element analysis modeling is really hard, even for nuclear engineers who can code.
A lot of scientific control system apps get built with LabVIEW or Agilent VEE. If it makes it beyond the niche consultancy stage, such things are often (re)written in whatever an engineer happens to know: C, Python, Haskell, LISP, [favorite language religion here].
Edit: BTW, just came across a supposed successor to VMEbus [0] (a standard industrial data bus): VXIbus [1]
When the compiler's code size is a fraction of the whole codebase, it would actually make sense to improve it instead of rewriting everything, like Facebook did for PHP with their HHVM. Obviously Fortran doesn't have a speed problem, but it could be improved in other ways, like static analysis and nicer syntax notations.
Converting to later versions of the Fortran standard would get you a long way there. Plus, with all the focus in the 2003 & 2008 standards on C interoperability, you can plug in later open-source libraries.
[edit: I wonder what a modern version of RATFOR would look like]
In Fortran one can add two matrices: A=B+C. Try it in C++ (NumPy is better that way, but it's still an extension on a rocky foundation). The amount of freely available high-quality math/statistical libraries for Fortran is unmatched. Why in the world would you give up on this just because you can use a Google account? Google Payments for settlement?? WTF? Settlement is a bit more than a credit card charge or moving money between checking accounts. Google Finance as a ticker plant?? Yeah, right. Even Yahoo provides options data; Google has just basic stock prices. Compare that with BBG, which has everything. Build a terminal in WebKit? Why???
Every time I see suggestions like this, I remember minions, rushing from one framework to another - "koonga la mala makuna, koonga!".
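To make the A=B+C point concrete: Fortran's whole-array assignment does in one line what C-like languages spell out element by element. A minimal plain-Python sketch of that explicit loop (`mat_add` is a hypothetical name; C or pre-operator-overloading C++ would look much the same):

```python
def mat_add(B, C):
    # What Fortran writes as A = B + C takes an explicit double loop
    # in C-like languages (plain Python used here for brevity).
    rows, cols = len(B), len(B[0])
    return [[B[i][j] + C[i][j] for j in range(cols)] for i in range(rows)]

B = [[1, 2], [3, 4]]
C = [[10, 20], [30, 40]]
print(mat_add(B, C))  # prints [[11, 22], [33, 44]]
```

NumPy's `A = B + C` recovers the Fortran notation, but as the comment notes, it is a bolted-on library rather than a language feature.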
I don't know how comparable this is, but Jane Street seems to be quite happy with OCaml. I for one would give that some consideration before actually going C++. And that's ignoring the question of why they would depart from Fortran in the first place.
People would be really surprised how much of the critical infrastructure of this country in government and enterprise runs on technologies around half a century old. This is nothing new. The IRS still uses millions of lines of COBOL to process our tax returns, the VA uses millions of lines of MUMPS (M) to store and process health and benefits records, and the list goes on and on. It's not going anywhere soon, either.
People skilled in these legacy languages and technology stacks are also amongst the highest paid and most in-demand in the country, at least here on the east coast in places like New York and DC.
Bloomberg can write a one-page Python flask/bottle wrapper around any of that Fortran and serve it up as a microservice. They got Pang Ko from MathWorks, who is one of the top parallel computing data structure minds in the world. I'm betting Bloomberg will be around until another player like JP disrupts the market with a truly distributed trading platform.
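A hedged sketch of that wrapper idea, using only the Python stdlib so it stays self-contained (the comment suggests Flask/Bottle; `http.server` stands in here). All names are hypothetical: in practice `legacy_compute` would call into the real Fortran shared library, e.g. via `ctypes.CDLL("./libquant.so")`, instead of this pure-Python stand-in.

```python
# Minimal microservice sketch wrapping a "legacy Fortran" routine.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def legacy_compute(x):
    # Stand-in for the Fortran routine so the sketch runs without a
    # compiled library; replace with a ctypes call in real use.
    return x * x

class ComputeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /compute/<x> -> {"result": ...}; rough parsing for illustration
        x = float(self.path.rsplit("/", 1)[-1])
        body = json.dumps({"result": legacy_compute(x)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve (this call blocks):
#   HTTPServer(("127.0.0.1", 8080), ComputeHandler).serve_forever()
```

The point of the design is that the Fortran stays untouched behind a C-linkage boundary; only the thin HTTP layer is new code.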
Just read the article. Yeah, a Bloomberg terminal does a lot (a lot, a lot) more than scroll "news tickers, checking Bloomberg email, and trading." When the author suggests that Google might make a play for market share (or some other Web 2.0 player), it is clear that there is an enormous lack of understanding as to exactly what Bloomberg is, as a product.
I didn't notice the "2006" in the title until I read that the terminal cost a grand a month! I thought something was off!! Bloomberg Terminal costs more like $2k/month per user nowadays... maybe more??
Quite. Google has never been about exactness; if 1 in 10k searches serves a bad result, they don't care.
And I have written mission-critical Fortran software for big telcos, and trust me, nothing concentrates the mind like when one of Vint's reports nudges you and says this had better be right or we are both looking for a new job.