The article is an interesting history lesson, but that sub-heading at the top sort of threw me for a loop, referring to Microsoft as "the first big tech company".
Microsoft was, and still is, a giant tech company, starting in the '90s and on. But if one looks at the digital corporate landscape back in the 1980s, IBM was an absolute behemoth. Consider this: in 1985, IBM had a market cap of $95 billion. That was more than the next 13 largest tech companies at the time, combined! That group includes Apple, HP, Hitachi, Intel, Motorola, Sony, AT&T, Texas Instruments, Verizon, Xerox, National Semiconductor, DEC, and Control Data.
It's hard to overstate how powerful IBM was at its zenith.
I only know this because I was curious and looked up the data the other day, and it blew me away.
The word "tech" itself changes every decade or so. Agriculture was considered "tech" at one point of time. The companies you listed are all "electronics tech" while Microsoft is purely "software tech" which is the prevalent definition of tech today.
IBM was also a software giant. People railed against IBM as they later did against Microsoft: the big player in town, which monopolized everything and whose evil hands seemed everywhere. Likewise, managers felt safe with IBM (like they later did with Microsoft): "nobody was ever fired for buying from IBM" was a well known saying.
IBM was the omnipresent giant before Microsoft became one. It's really hard to overstate this.
I'm not sure if this is intentionally obtuse, but yes, "tech" as a slang term doesn't mean "technology"; it describes a type of company largely known for the prowess and scale of their user-facing software. This includes such entities as Facebook, Google, Apple, Microsoft, Netflix, and Amazon, but the list is not exhaustive, of course.
Tesla, on the other hand, clearly produces technological artifacts - quite impressive ones, at that - but they are not a "tech" company within the colloquial context you've stumbled into.
> I'm not sure if this is intentionally obtuse, but yes, "tech" as a slang term doesn't mean "technology"; it describes a type of company largely known for the prowess and scale of their user-facing software. This includes such entities as Facebook, Google, Apple, Microsoft, Netflix, and Amazon, but the list is not exhaustive, of course.
Except I would argue that Apple is known more for its hardware than its software.
Also, I believe most people don't give a second thought to Netflix's software. It's the content they think about first.
It must be very context specific; I'm in IT, in an English-speaking country (Toronto, Canada), and don't even remotely think of "Tech" as "front-end user-facing software only". :-/
What kind of companies do you call "tech" companies then? I can't really understand the objection I'm facing: when people tell you their stock portfolio is heavy in tech, do you really think of companies other than the ones I listed? I live in the Bay Area, so maybe things are skewed towards "local" companies, but I held a similar opinion when I was still in Canada too...
EDIT: Just to hammer it home, search "is faang tech" and weep that "corporatefinanceinstitute.com" is on my side
Of course FAANG is tech. Your assertion was not "FAANG is tech"; it was "FAANG is all that tech consists of". These are not equivalent, and I recommend you view [1].
That's exactly the distinction I was making. Do you call Ericsson a tech company? I certainly never hear them in that bundle.
Again, "tech" != "technology" in this domain in my view. It's the double-great-gp or so that says it, but "tech" in my view leans towards companies with humungous software outreach and data ownership, so clearly FAANG but also Microsoft, Palantir, data warehouses, that's the kind of thing I put in this "tech" bucket. It's a type of company that probably knows more than it should about a lot of things.
And again, this isn't the "Wikipedia" definition (which itself has context-based caveats...), but my opinion and experiences involve this being the sphere of companies one calls a tech company today. I apologize for my inflammatory tone in the other comment and hope I have explained my point, even if you hold a different experience
I probably would call Ericsson a tech company honestly, though that would be a little iffy.
However, I definitely would call a hardware company (Dell, HP, Lenovo, etc) a tech company, and their software all sucks.
That being said, you have certainly made your point, and with your definition here I can see why you would think that way, even if I disagree. In addition, I agree with the sentiment "It's a type of company that probably knows more than it should about a lot of things." Though I don't necessarily think this defines "tech", or that tech is the correct label, I do think these companies should get their own label.
As for your other comment, eh, no big deal. We all make comments more aggressive than we intend every once in a while.
EDIT: I do think there is a difference between "tech" and "big tech". I think that "tech" would be as I described before, and "big tech"/"big data" is what you were describing.
If “purely software tech” were the definition of tech today, that would leave out Apple - mostly a hardware company by revenue - and Amazon - half of which is a retail company shipping physical goods.
It's a good point about the definition of tech. I was trying to refer to digital tech companies. But certainly as you go back in time, tech applies to so many industries before the digital age. Automotive, photography, industrial equipment, aviation, chemical engineering, materials science, etc etc all could be thrown into the technology umbrella, so it's important to make a distinction.
It's only been a recent trend, and it's more to experiment with new form factors and nudge the PC ecosystem in the right direction than anything else.
And the recent trend is for software companies to start making their own hardware (Amazon making Graviton and Echo products, Google making TPUs and the Pixel) while hardware companies move into software (Apple moving into Apple Music, TV, News).
Clearly Microsoft's focus is on software. But I think you'd be surprised how early they started producing hardware. Back in 1980 they already had huge successes in hardware, starting with the Z-80 SoftCard and RAMCard.
Exactly, there's not much recent about them doing hardware. But if you don't know the history and only look at the past 10 or so years, which is in some ways understandable, then it could indeed look like e.g. the Surface line is 'MS recently doing hardware'.
> It's only been a recent trend, and it's more to experiment with new form factors and nudge the PC ecosystem in the right direction than anything else.
> And the recent trend is for software companies to start making their own hardware (Amazon making Graviton and Echo products, Google making TPUs and the Pixel) while hardware companies move into software (Apple moving into Apple Music, TV, News).
Define recent? The Xbox came out in 2001, and they were working on media players before then I believe.
In fiscal 1993, Apple posted annual revenue of $7,977 million (~$8 billion), a year-over-year increase of 12.6%. Two years later, in fiscal 1995, the company crossed the $10 billion annual revenue milestone for the first time. After that, however, annual revenue declined and didn't cross the $10 billion mark again until fiscal 2005, when Apple generated $13,931 million ($13.9 billion), registering 68% YoY growth.
> The new fonts had to mimic the established core set of PostScript fonts, which included Times Roman and Helvetica. Since Monotype had originally developed the Times fonts back in the hot-metal days, that part was easy. To compete with Helvetica, though, they chose to adapt an earlier Monotype design with similar characteristics, called Arial.
I’m really annoyed at this particular substitution. Yes, I understand that Linotype was being annoying, but the choice of Arial as a Helvetica replacement just grinds my gears. It’s a Helvetica replacement, all right: one worse in literally every way. It’s all the bad parts of Helvetica with none of the good ones. Helvetica is precise with its terminals; it’s styled with its graceful flourishes on letters like the capital “R”. Arial has none of that charm. But it has all the downsides of Helvetica: it’s really wide; it’s not the best for readability. Helvetica really is a pretty font, but Arial is utterly awful.
I’m just so sad it ended up being given such a foundational place in digital typography. I subconsciously wince when I see a default Google Doc, or when someone says to “just use Arial” when Helvetica isn’t available. I really wish they had gone with something completely different instead of using a bad-looking copycat font.
I quite like Helvetica Neue, it just "suits my eyes" and the shapes and proportions feel so natural to me. Your rant about Arial made me curious how it came about.
Ah, right, Helvetica Neue was used in iOS and macOS for a number of years. No wonder it feels so familiar.
Yes, I see what you mean about the flourish on R. Same with G, and the tail on "a". I agree that Arial is inferior for pretty much every letter in the comparison.
---
Apple recently replaced Helvetica Neue with San Francisco (the name is confusing to me, because they had a font of the same name years ago).
I am always amazed by so much effort and so much research going into something that is often taken for granted and not really appreciated at its true value by most users (me included).
An absolutely insane amount of research, testing, and polishing went into Windows 3.11 and Windows 95; I think that's when Microsoft's ratio of polish to code reached its peak.
Edit: Just to clarify, I am talking mainly of the UI here, not the underlying OS, which was, of course, very basic.
Although with virtualized hardware, my experience of running both Windows 95 and Windows ME has been quite good.
Also, although people often praise the architecture of the NT-based line, I think 3.x and the rest of the DOS-based Windows were far more impressive; while NT is very much a "traditional" OS, 3.x and 9x are in some ways far more similar to a bare-metal hypervisor.
This fact becomes very apparent when one tries to write drivers for the latter OSs, wherein you write a user-mode library that basically behaves as if it's accessing the hardware "directly" as an old DOS application would, and then write a "virtual device driver" to virtualise/multiplex the hardware among multiple "virtual machines".
There's admittedly far less protection than in the NT world, where all applications' access to hardware needs to pass through the kernel (and DOS ones are run in an emulator), but the thin and permissive virtualisation layer --- where applications can essentially access the hardware directly if not "trapped" by virtual device drivers --- makes for amazingly low resource usage and high performance. A DOS application running in 3.x/9x can interact directly with the hardware, and the only overhead is that of V86 mode, comparable to the full x86 virtualisation extensions (AMD-V/VT-x) introduced much later.
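To make the trap-and-pass-through idea concrete, here's a toy user-space simulation (my own sketch, not real VxD code; actual virtual device drivers are ring-0 code written against the VMM's services, and the port numbers here are just the classic PC assignments): untrapped port I/O goes "straight to the hardware", while a virtual device driver intercepts only the ports it wants to virtualise.

```c
/* Toy simulation of the 3.x/9x model described above: a DOS program
 * performs port I/O as if it owned the whole machine, and a "virtual
 * device driver" intercepts only the ports it cares about, letting
 * everything else through. Illustrates the trap-and-multiplex
 * concept only; real VxDs are ring-0 code using VMM services. */
#include <stdio.h>
#include <stdint.h>

typedef uint8_t (*port_handler)(uint16_t port);

/* One slot per x86 I/O port; NULL means "not trapped". */
static port_handler trap_table[0x10000];

/* A pretend keyboard VxD: multiplexes the real device by handing
 * each "virtual machine" emulated data instead of raw hardware. */
static uint8_t virtual_keyboard(uint16_t port)
{
    (void)port;
    return 0x1C;            /* fake scancode: Enter key */
}

/* What an IN instruction conceptually does under 3.x/9x. */
static uint8_t inb(uint16_t port)
{
    if (trap_table[port])                 /* trapped: VxD emulates it */
        return trap_table[port](port);
    /* Untrapped: in the real OS this is a direct IN instruction,
     * i.e. the application touches the physical device itself. */
    printf("  (direct hardware access to port 0x%03X)\n", port);
    return 0xFF;
}

int main(void)
{
    trap_table[0x60] = virtual_keyboard;  /* keyboard data port */

    printf("port 0x60  -> 0x%02X (virtualised by the VxD)\n", inb(0x60));
    printf("port 0x3F8 -> 0x%02X (passed straight through)\n", inb(0x3F8));
    return 0;
}
```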
As my other reply here notes, the thin (and permissive) virtualisation layer means that badly written applications could crash the system, but I blame that on the applications more than anything else; too bad that most people seemed to think MS was to blame, when DOS had even less protection. I used 98SE as my daily OS until well into the Vista era (at which point I gradually switched to XP), and it was definitely not the "reboot multiple times a day" experience some others describe, but then again, I also wasn't in the habit of running lots of low-quality applications. Uptimes of over a month weren't uncommon, and more often than not they were brought to an end by me doing something stupid with the application I was writing.
(In fact, I've experienced far more trouble of various sorts with a recent new machine that runs Win10.)
> The original version of SimCity was written for Windows 3.x and included a bug that read memory that had been freed to the system. It worked in Windows 3.x, even though it shouldn't have, because that particular range of memory wasn't being used for anything else until the program was terminated.
> In beta versions of Windows 95 SimCity didn't work, because the operating system allocated memory differently, and SimCity would crash as expected because of the bug in the program. Amazingly, in the final version of Win95 the original SimCity worked. Microsoft engineers had actually tested backwards compatibility with SimCity, located the bug, and worked around it in their source code.
Workarounds like that, which detect a particular app running and enable a special mode, are called shims; Microsoft created hundreds of them to ensure compatibility with popular apps in new OS versions.
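A minimal sketch of the idea (mine, not Microsoft's actual code; the detection list is made up for illustration, and the real shim engine hooks APIs at load time rather than wrapping the allocator like this): if a program known to read freed memory is detected, the allocator quarantines freed blocks instead of recycling them, so stale reads keep seeing the old contents.

```c
/* Toy illustration of an app-compat shim for a use-after-free bug. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static int buggy_app_detected;  /* set once at process start */

static void detect_app(const char *exe_name)
{
    /* hypothetical list of apps known to read memory after freeing it */
    static const char *known_buggy[] = { "SIMCITY.EXE" };
    for (size_t i = 0; i < sizeof known_buggy / sizeof *known_buggy; i++)
        if (strcmp(exe_name, known_buggy[i]) == 0)
            buggy_app_detected = 1;
}

static void compat_free(void *p)
{
    if (buggy_app_detected)
        return;  /* quarantine (leak) the block so stale reads stay valid */
    free(p);
}

int main(void)
{
    detect_app("SIMCITY.EXE");

    char *city = malloc(16);
    strcpy(city, "city data");
    compat_free(city);

    /* The app's bug: a read after free. Under the shim this still
     * prints "city data"; with a strict allocator it's undefined. */
    printf("%s\n", city);
    return 0;
}
```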
Hmm, using Windows 98 in 2007? I had so many problems with Win98 compared to Win2k, I really have trouble understanding why you'd do this. Especially because you say you only ran high quality applications. High quality software got ported to NT relatively quickly IIRC.
While I also preferred Windows 2000 once available, most likely he was using Windows 98 SE, I remember it being quite reliable as Windows goes, especially compared to the nightmare that was ME.
On install of Windows 98 SE I would always turn off Active Desktop and the fancy-looking information pane on the left-hand side of Windows Explorer.
Both relied on IE to work and just made everything feel really slow and sluggish. It's also possible that having them disabled improved system stability.
Polished? Windows 3.1? It was a shell over DOS. Windows 95 introduced plug and play and a decent driver model.
Windows 2000 was peak MS. I could even make an argument for XP. It just didn’t hold up well in the Internet era because of all the security vulnerabilities.
This is a common misconception, but in fact since Windows 3.x there was a kernel that did a lot of stuff but made it look like DOS was in control. This was in effect a virtual machine implementation, to make DOS apps behave correctly under multitasking while they were written as if owning the whole machine. To facilitate backwards compatibility with DOS drivers, Windows 95 emulated quite a bit of the DOS kernel too, with some performance penalty. Raymond Chen's The Old New Thing blog has a number of posts explaining all this, this one perhaps being the most informative: https://devblogs.microsoft.com/oldnewthing/20071224-00/?p=24...
I think it is sort of a continuum, with 95 being further along the path begun with Windows 3.x's "386 Enhanced mode", or maybe it was like that in Windows/386 already (that was the branding for Windows 2.0). I have a vague memory of reading a Raymond Chen blog post about how that worked, but my search skills fail me at this time. However, here's another site summarizing how that mode worked, allowing MS-DOS app multitasking among other things (32-bit disk I/O!): https://networkencyclopedia.com/windows-3-1/
It's also fascinating how many people back then had almost no background in computer science. Many had all sorts of weird backgrounds; they just found computers interesting, fell into different roles, and learned everything they needed on their own.
I've been using the new Cascadia font that Microsoft released with their new Windows Terminal for a bit, and I can say I enjoy it. It's playful.
I don't use color schemes and have a white background, so this typeface provides all the joy I need/want. I don't mean this as anything more than an opinion of course. It's a matter of taste.
How can you find out what it is called? The inspector in Chrome shows the highest preference font as LAText, but I can't find any good results when I search for that.
Fantastic read. There is sufficient material for someone to author a sizable book covering the history of digital type in computing and how it came to be standardized. Digital type is such a critical part of our computing experience, but it is so deep under the covers that it does not get any attention.
Reminds me of Jobs' commencement speech at Stanford, where he attributes the explosion of fonts to his earlier calligraphy studies. The bit about connecting the dots looking backwards.