> Don't use analytics at all but focus on your product.
Huh? Analytics is how you focus on your product.
By instrumenting your product with analytics, you can find out if customers are using your new useful feature or if they can't find it. If they're performing a task quickly because it's easy, or slowly because they're struggling with the UX. And you find out that customers on a certain mobile device are suffering huge performance issues, for example.
You don't know these things until you measure them. That's analytics.
Obviously analytics are only one piece of product improvement -- there's sitting down with users for 30 minutes to watch them use the product, interviews, surveys, etc.
But analytics are a critical piece. You can't focus on the product without analytics.
Analytics isn't just about conversion. Analytics is about the entire product experience.
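To make "instrumenting" concrete, here's roughly the level of effort involved. This is only a minimal sketch, not any particular vendor's API; the /events endpoint and event names are made up for illustration.

```typescript
// Minimal first-party event tracking sketch (hypothetical /events endpoint).
type EventProps = Record<string, string | number | boolean>;

function track(name: string, props: EventProps = {}): void {
  // keepalive lets the request finish even if the user navigates away
  fetch("/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name, props, ts: Date.now() }),
    keepalive: true,
  }).catch(() => {
    /* analytics failures must never break the product */
  });
}

// Did users find the new feature, and how long did the task take?
track("export_dialog_opened", { source: "toolbar" });

const start = performance.now();
// ... user completes the export flow ...
track("export_completed", { durationMs: Math.round(performance.now() - start) });
```

That's all it takes to start answering "did anyone find the feature?" and "how long does the task take?", with the data landing on your own backend.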
> Huh? Analytics is how you focus on your product.
For the first decade of my career, I really believed this, and spent a lot of time doing split tests, and studying analytics, and trying to "make things better" by understanding the numbers as they were given to me.
This was a mistake.
I've since learned that these numbers will rarely help me make meaningful improvements to my product, and my business. Sure, they can be useful so that I'm not "running blind," but they simply aren't going to show me how to create an ingenious idea that takes things to the next level.
Analytics will help you optimize things to a "local maximum", but they'll blind you to the real possibilities of creating something new that can completely transform a business. Once I understood this distinction, I became quite a lot more effective.
There's a similar problem with things like "user interviews." A common pitfall is to ask people which features they want. That has limited use. The real work you need to do is the "creative thinking" that others haven't done. Figure out what people don't know they want; learn what the numbers can't tell you. Then go and build it. Yes, understand the numbers, choose a good business model, and optimize based on those numbers, but don't let the numbers create the product. It's a dead end.
> but they simply aren't going to show me how to create an ingenious idea that takes things to the next level.
Of course not. There's no substitute for straight-up creativity and deep thinking.
But once you have your ingenious idea, you still have to design it, make sure it's clear to users, that they find it and can use it effectively. Your "ingenious idea" may turn out to be largely sabotaged if a button you thought had an intuitive label is misunderstood by 90% of users, or a link you thought was highly visible is being scrolled past by nearly everyone.
Yes, analytics is all about optimizing things to a local maximum. But you might not be anywhere near your local maximum. It's astonishingly easy for the first version of your ingenious idea to only be achieving 5% or 10% of the actual local maximum potential. We shouldn't downplay the difficulty or achievement involved in getting even close to a local maximum.
And you're correct that in user interviews, if you only ask what features they want, you're drastically limiting the value you might uncover. On the other hand, you'd better not ignore the features users are frequently requesting either. A lot of users are pretty smart and know exactly what they need, at least to get to that local maximum.
If you’re just using analytics to look at how effective UI designs are in making business conversions, couldn’t you still measure that by checking the backend and looking for a spike in activity towards the API endpoint that the UI invokes? Couldn’t you measure effectiveness with a spike or drop in sales? I mean, good UX doesn’t so much rely on Google Analytics but on a UX engineer’s depth of knowledge about human psychology.
> couldn’t you still measure that by checking the backend and looking for a spike in activity towards the API endpoint that the UI invokes?
You could have multiple UIs hitting the same endpoint. Also, why limit yourself with such crude metrics?
> good UX doesn’t so much rely on Google Analytics but on a UX engineer’s depth of knowledge about human psychology
UX in theory, and UX in application are two different things. You could have the best models of how users will interact with your site, but until you deploy and measure, you have no idea what will happen.
You can still parameterize the API calls if you want to attribute user activity to a specific flow, and that way you wouldn’t be “feeding the beast” that is GA.
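A quick sketch of what I mean. The endpoint and flow names here are invented for illustration; the point is just that the product's existing API call carries the attribution.

```typescript
// Sketch: tag the product's own API call with the UI flow that triggered it,
// so backend logs/metrics can attribute activity to a flow. No third party needed.
// The /api/checkout endpoint and flow values are hypothetical.
async function checkout(cartId: string, flow: "quick_buy" | "cart_page") {
  const res = await fetch(`/api/checkout?flow=${encodeURIComponent(flow)}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ cartId }),
  });
  return res.json();
}

// The same endpoint, reached from two different UIs, now shows up
// separately in your own server logs.
checkout("cart-123", "quick_buy");
```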
How you mark/report the events is different from where you report them. You could use any of the self-hosted solutions on your own domain instead of GA without much changing the way you report back.
There's plenty of alternatives to Google Analytics, including open-source software you can self-host so it doesn't share your users' private data with a third party. You don't need to roll your own just to avoid GA.
If you are unable to find out that 90 percent of people can't use a major feature that you thought would differentiate your product, you are missing all forms of feedback, including much more important sources besides analytics.
I disagree completely because I've seen it in practice.
A button turns out to be below the fold on common small screen sizes that a new designer forgot to consider. A bad translation results in 90% of users in a particular country misunderstanding something. A JavaScript library doesn't work on a common Android phone you don't test with. Latency issues make something virtually unusable from the other side of the country because of a single badly written function that's easy to rewrite -- but you still have to catch it!
You need ALL forms of feedback. It's not a question of some being more important than others. They're all important and play their own unique roles. Analytics is for catching AND debugging all the things that go wrong at scale in the real world, as opposed to the artificial and limited environments used for user testing.
One goalpost at a time please. 90% of users in "a given country" is not the same thing as saying 90 percent of all users. Unless that country is so important that it's everyone, and again you needed to test that your product is usable in the first place.
And even so, while there are a lot of countries and languages, a collection of tests in all of them, which is all it takes for these examples, is not "big data". Again you are crediting analytics for things that people were fully capable of and responsible for doing in the days without traffic data. I realize it is great for the resume and sounds sexier to do "analytics" as opposed to just basic software testing (you should see what people are calling "AI" in other industries these days). And I'm certainly happy to agree that analytics has some value, say in improving wording and stuff that a single instance won't tell you, but again these aren't cases of that.
If you see in your backend metrics that a button isn't clicked by a huge number of visitors, you can start there and analyze. Common screen sizes are helpful, but it's more helpful to look at the page, since maybe other optimisations can be made if that is the important button.
Similarly with languages, which you can see from backend metrics.
More data always seems nice, but analyzing more data doesn't make analysis simpler, and there is a big privacy impact in analytics, especially when outsourcing to data collectors who can gather data across sites. (Which also gives Google information about which services interest users, which can then be integrated into search etc.)
> if you are unable to find out that 90 percent of people can't use a major feature
How would the users know they can't use a feature they don't know exists?
Let's say you add a brilliant new feature X, but due to a bug the users can't load the code for that feature, so they never see it. How would they know to submit feedback for a feature they don't know is there?
Because when you're talking to them directly you ask them about it. This is what I mean by all forms of feedback.
I'm getting the impression that people are just ignoring all advice about communicating with customers in their startups and just throwing stuff out there to see what sticks. Besides being wasteful in doing stuff no one wants anyway, what if that bad feature crippled your product and your paying customers have permanently switched to a competitor the instant your change frustrated them? And now you're bankrupt and can't afford analytics. Relying on analytics as a crutch to catch these things was the mistake in the first place.
Fun story I actually had forgotten about till now: I briefly worked at a tiny startup out of my school in the ending days of the internet bubble, trying to sell a "data mining" software product. Way too soon before it was cool, sadly. It was really hard to make the case that people needed to pay us $100k for the benefits we could get from their data. And companies certainly weren't going to go for a pitch like "if you launch a broken product we will catch that fact". They spend a lot of money to be sure that doesn't happen already. We even had one major customer figure out that they could just have an engineer perform a simple counter over incoming communications that would catch all they needed to know, and hence they didn't need our product anymore. That was kind of the end for us, in fact.
You're essentially arguing for qualitative data instead of quantitative, but both together is usually where the money is. I agree that qualitative analytics are underestimated because they're hard to do, but I also think that having quantitative analytics together with qualitative allows you to contextualize your numbers in ways that lead to insights you wouldn't have otherwise.
Also, after you've already reached product-market fit, it's important to take your product to its "local maximum".
I think you may be right; after all, thinking critically about my story above, I did spend quite a lot of time learning about analytics and quantitative numbers. It could be this gave me an intuitive sense of what works, which I could then apply to the more creative thinking. I don't know. Either way, I'm grateful to make a living the way I do.
Qualitative data has another challenge - representativeness. It's very easy to do 10 user interviews and feel comfortable that you understand the market. Our brains lie to us all the time.
Quantitative data lets you drill down into different dimensions. Because it is much easier to collect at scale (it's the sum of your users' interaction w/ your product, after all!), it's much easier to make representative decisions.
No it isn’t, because the data will never tell you “why” people are doing something or not doing something. You can guess, but you’ll never know why until you a) talk to users and b) watch them use your product. Qualitative research isn’t about statistical significance, it’s about deep insights. 10 user interviews will uncover 100 insights.
What makes you think the 100 insights are actually insights, and not simply deriving from confirmation bias?
To be clear, I support qual + quant user research (+ split testing). They are all tools that help influence product direction. I have, however, seen a lot of cases where qualitative research uncovers an insight that doesn't actually match real world behavior when we bake it into the product.
Compare that to a split test, where I don't know why something is happening, but I am able to optimize against goals. It gives less insight but is more foolproof-actionable. With insight, you have to go insight -> improved_action. With testing, you just have the improved action.
To me, qualitative research is crucial to build a model of the environment. These can be used to generate hypotheses, which you then split test. Any time I see someone take direct insight and build a product feature from it, I have a lot of questions about what alternate ideas they tested. And how to know the one that was developed is as close to optimal as you can get, given the fixed resources available.
Apple famously "ignores" its users, partly because users usually can't see far beyond what is in front of them, often because they don't know about impending advances in technology or clever new designs. They'll ask for faster, cheaper versions of what they already have (faster horses, cheaper buggy whips as they say) rather than the next big thing. Faster/cheaper weren't the primary draws of the Mac, iPod, iPhone, iPad, etc. (though price/performance is a big draw of the M1, the big breakthrough is performance/watt which leads to all-day battery life and better thermals.) Instead it was a quantum improvement in design, usability, and functionality combined.
As another example, consider that in 2007 Apple developers were begging for an iPhone SDK, and Steve Jobs crushed their hopes by telling them to just make web apps. A year later Apple came out not just with an iPhone SDK, but with an entire App Store. (Though I suppose some developers [Epic] and users [HN] wish they had just come out with an SDK, and that the iPhone wasn't locked down.)
That being said, they do a lot of user testing of the next big thing before it is revealed publicly.
It's really not that useful for Apple to collect analytics because they make physical products, where the potential of analytics is limited. When it comes to a SaaS or a web page, the possibilities of analytics are much greater.
And yet, Macs will still send usage and performance data to Apple so they can incorporate that information into future product versions and find out about system software issues.
We're pretty deep in a single thread here, but the original article isn't saying "don't use analytics", it's saying "don't use Google Analytics".
Your SaaS or webpage probably gathers 80% of what Google Analytics does in your log files. It could without doubt provide deeper insights than GA is capable of by adding your own behaviour tracking code (which can be written with an understanding of your specific problem domain, rather than being an "everything to everybody" generalised solution).
Nobody is suggesting we shouldn't collect usage and performance data; the thesis here is that we shouldn't send it all directly to the world's most profitable advertising agency just because they'll draw us some pretty graphs for free.
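To make that concrete, here's a rough sketch of pulling per-path traffic out of a standard combined-format access log. The log path, and the assumption that your server writes combined-format logs, are mine.

```typescript
// Rough sketch: pageviews and unique client IPs per path, from an
// nginx/Apache combined-format access log. Path and format are assumptions.
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

const LOG_LINE = /^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)[^"]*" (\d{3})/;

async function summarize(logPath: string): Promise<void> {
  const perPath = new Map<string, { hits: number; ips: Set<string> }>();
  const lines = createInterface({ input: createReadStream(logPath) });

  for await (const line of lines) {
    const m = LOG_LINE.exec(line);
    if (!m) continue;
    const [, ip, url, status] = m;
    if (!status.startsWith("2")) continue; // only count successful responses
    const entry = perPath.get(url) ?? { hits: 0, ips: new Set<string>() };
    entry.hits += 1;
    entry.ips.add(ip);
    perPath.set(url, entry);
  }

  for (const [url, { hits, ips }] of perPath) {
    console.log(`${url}\t${hits} hits\t${ips.size} unique IPs`);
  }
}

summarize("/var/log/nginx/access.log");
```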
Well, you can't really opt out of all analytics. It's really hard to completely stop all server logs, or to stop them tracking how many purchases you make (e.g. invoices).
Where exactly do we draw the line to what's core for a company to track in order to run their business and what can be opt-out?
The issue I have with analytics, in addition to your (imho) valid points, is that lots of users in practice seem to always measure the wrong endpoints and don't include negative outcomes.
Leads are useless without measuring how many people bounced off your shitty online shop, because they couldn't understand your UI.
Staying time is useless without measuring closed tabs, because your website is unreadable with ads.
The irony behind it is that ad metrics are used to buy/sell a website's worth for ads. Sometimes I feel like it's a bubble invented on purpose so that nothing has a measurable outcome.
I agree with you in theory, but I'll point out that you do measure bounce rate, recurring vs. new visitors, and staying time at the same time, to give you that exact view.
But I agree they measure the wrong endpoints, or rather they measure too many endpoints for me to make sense of which one is better. The goal of overall traffic sometimes works against keeping bounce rates low or staying time high.
In the end you would think products sold is what matters. But an overpromising website will overhype and underdeliver, causing returns or bad reviews which may affect future sales. The message has to describe the product but still meet the market's demand.
> There's a similar problem with things like "user interviews." A common pitfall is to ask people which features they want.
Asking what users want is not the right way to do interviews. I advise you to read "Just Enough Research" by Erika Hall. The goal of interviews is to gather enough data points to understand users' needs and struggles, to know their journey through the product with accuracy, and to understand why they use or don't use a function. It is not a way to get a "wish list".
Then, designers usually have tons of tools and methods to process this data, make decisions, try creative solutions (usually more than one), and play them back to the users through prototypes to see which ones work better. You can look into design thinking as a starter, but there are many more.
A similar book is The Mom Test. You're not supposed to ask the user how they would have solved the problem (had they known, they would have solved it themselves); you're supposed to understand their pain points and then design the most effective solution for them.
I've done so many projects that revolved around instrumenting every little click and creating A and B versions of an experience to run tests against and in all that time I can't recall a single useful product decision that was actually informed by the collected data.
I would disagree on user research. It works on certain types of broad questions like messaging and branding. It's probably overused though.
I'm with you in having come around on not trying to cram everything into that hole of testability. But I do think there's room, specifically when it comes to testing your ideas. It's rare to look at data and see what the problem is, but it's common to come up with a hypothesis for what the problem is and a way to experimentally measure whether your new solution has actually made a dent.
Why is analytics only numbers / quantitative data to you? I've used services like HotJar that record the user's interactions on the site or app, which is qualitative analytics at scale, and it helped me identify how the user was actually using the product. I would still call this analytics even though it's not numbers-driven.
Why would you need consent to track anonymous data from users, without sharing the data with 3rd parties? What's the difference between recording an event "clicked_purchase" and displaying the click event in a graphical way, as a recording or heatmap?
Do you know of any legal document mentioning this, that storing the clicked element is allowed, but not the click position? They can be both used to run the business and improve the product, thus being in the "necessary" category.
Of course not. And unless I'm in the EU, I don't need it (and even then, not really, HotJar can be made GDPR compliant). HN will definitely balk at me saying that, but understanding UX is much more important than whatever philosophies one has about not tracking users. Because if you don't understand UX, there's no point to the debate about tracking vs not tracking users, because you won't have any users in the first place.
Only on HN would you find people seriously arguing that focusing on product development is a good reason to remove analytics from a site. I've worked almost exclusively at companies where the product is the website, and initially this idea struck me as laughably naive. But let me be fair and think through this.
Tech startups do definitely have this problem of focusing on website analytics where the product is NOT a website or app. If we're generous we can assume many people here develop for these kinds of companies. Some waste a lot of time looking for up-and-to-the-right arrows for investors or trying to be data-focused when data about the website isn't actually all that important. Many of these companies might actually be better off with no analytics to waste time on. I'd still argue it's better to check in every once in a while to look for problems and ask yourself some questions.
The idea of removing analytics where the product is an app or a website is silly. This would be like arguing a grocery store shouldn't track what people are buying from their stores, and instead just source good products. You need to do both. What are you going to do when I ask what is or is not working? Tell me your feelings? Shake an 8-ball? Aside from detecting problems, analytics can be a jumping off point for innovation if you're smart about it. What can we do that's more like what's working? How can we improve this page type?
There are for sure people who over-focus on analytics (often on the wrong data points) instead of creativity, but these are not mutually exclusive. If I were to list the millions of dollars I've earned and saved via analytics this would be a very long post. Sadly, most of those millions were for other people, but it's a very valuable tool for optimizing and creating if you use it correctly.
You could talk to your users... (i.e. UX research), observe them using your website or product and ask them non-leading probing questions to see the intent behind the behaviors affecting your bottom line. Qualitative research methods are a rich source of insight that is typically underinvested in and underutilized.
Analytics (quantitative data) can help you find bottlenecks to explore further by doing qualitative user research and getting to the ‘why’ behind the people problems standing in the way of the metrics you are tracking (retention, adoption, etc.). This is called ‘triangulation’: using quant and qual research methods to understand your users more deeply than looking at data alone can achieve.
This is a very important point, thanks for saying this. It’s amazing how many people think they can start an indie SaaS and think that all they have to do is build it, deploy it, and buy AdWords or whatever. Talking to your customers is more important than any of that, and it’s fun.
UX research participants are usually compensated, and so it's usually done to drill down deeper into problems that analytics found and to test possible explanations. I don't see what's unscientific about that.
Could be that, it could be design, it could be how something is worded, it could be that what you want the client to see isn’t being seen. Analytics helps to identify this problem. For instance I noticed recently a huge drop-off in visitors from iPads, and a redesign apparently made some parts of the site dysfunctional for some iPad users, something we didn’t catch earlier.
> You can't focus on the product without analytics
This is provably false. You do not need intrusive analytics to develop fantastic products.
Have people somehow forgotten about good old-fashioned user testing? It is expensive, time consuming, and amazingly effective. Most importantly you can actually talk to your users because they are people instead of data points.
But then surely, adding a prayer to St. Isidore of Seville [0] before every release will be better than just the both of them, so three > both > one.
I've recently interacted with a startup where the amount of resources they spend on trying to get them both, is making them blind to the power of one. It's not a pretty sight when all the numbers are tracked, plotted and planned on, yet nothing seems to work.
You can misuse everything. As you mentioned: "all the numbers are tracked, plotted and planned on, yet nothing seems to work." Sounds like there's a plan but no execution / follow up. That's not a problem with either analytics or user feedback.
Out-of-the-box, no. With custom work, yes. The amount of effort is not large, and there are off-the-shelf solutions available. The name commonly used is "engagement timer."
By default, GA only sends one hit on page load. If there's no second hit, there's no way to tell if someone was looking at the page for a second or a minute or an hour.
In theory the correct way to send analytics when leaving the page is via navigator.sendBeacon, although it's not clear how reliable it is: https://volument.com/blog/sendbeacon-is-broken (note: read the comments for some rebuttals to the article)
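For reference, a bare-bones version of that pattern might look like the sketch below; the /collect endpoint is hypothetical, and as the linked post discusses, delivery on page exit is best-effort.

```typescript
// Bare-bones engagement timer: report time-on-page when the tab is hidden.
// The /collect endpoint is made up for illustration.
const pageLoadedAt = Date.now();

function reportEngagement(): void {
  const payload = JSON.stringify({
    path: location.pathname,
    engagedMs: Date.now() - pageLoadedAt,
  });
  // sendBeacon queues the request so it can outlive the page;
  // fall back to fetch with keepalive where it isn't available.
  if (!navigator.sendBeacon?.("/collect", payload)) {
    fetch("/collect", { method: "POST", body: payload, keepalive: true });
  }
}

// visibilitychange fires more reliably than unload, especially on mobile.
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") reportEngagement();
});
```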
But only if the user goes on to visit another page. By default you don't see how much time they spent on the last page they visited, which is the only page if they visit just one.
This is not the same kind of analytics. You are talking about something like HotJar with heat maps to find out how customers use the product for example.
No, I'm talking about Google Analytics which is the topic of the article and parent comment.
Heat maps are great too but Google Analytics is still used as the foundation for figuring out which types of users are clicking and not clicking on what, both in isolation and as part of a pathway between elements/pages/etc.
Plug: If you're interested I have built a tool to solve this need of having both qualitative data (heatmaps, recordings) and quantitative data (stats, segments), plus it's self-hosted for highest privacy: https://usertrack.net
I worked for a tech company popular with enthusiasts when GDPR was first rolled out. We had a lot of requests from users who wanted us to provide their data per the GDPR allowance. We also had an influx of tech journalists filing GDPR requests in hopes of catching us doing something wrong or tracking too much personal data.
When we sent users their "data", many of them were in disbelief at how little data they received. Many had come to believe that all tech companies are secretly building inventories of user data to sell to 3rd parties, when really most of us just want to know if our heavy users of Feature A are also heavy users of Feature B, or if Feature C is more popular with new users but not old users.
The strange part is that tech companies are taking the brunt of the bad PR for things like gathering customer feedback and serving relevant ads, while traditional companies like cell phone providers and credit card companies are actually selling customer data. The latter doesn't get enough attention despite being a much more widespread issue.
Facebook doesn't sell your data, but your phone provider and credit card company probably do. But ask the average person who's selling their data, and Facebook will get all the blame.
I think Facebook gets the blame because they are at the end of the chain. They don't sell your data but they do sell the access to your data (from themselves through the website or tracking and what they bought from said other companies). Being that major player in the service it makes sense that they get the heat, but at the same time most people are still tech illiterate. I mean look at how people think Amazon is only a retail company.
I have never been inside a newsroom, so I can't know for sure, but I suspect facebook also gets a fair amount of the blame from news media because of their fraught relationship as pseudo-competitors.
Huh, thank you. Obviously the people on the ground don't, but there's no truth to the idea of what stories are greenlit by editors?
Could be as simple as occasionally removing mentions of companies that advertise with the paper in pieces about customer data creating the effect, not J. Jonah Jameson telling people to get him pictures of Zuckerberg :D
Relevant ads are the kind you get on DuckDuckGo: relevant to the content you're looking at. E.g. if you look at a site about origami, you get ads from arts&crafts supply stores.
What you've probably meant are called predatory ads: they chase you around the Web wherever you go, like a predator chases its prey.
Statistically speaking, you're more likely to buy a second object right after you bought one than someone who has not shown interest in the product. Things break, you might want to return it for a slightly different version, you might buy one for a friend.
For you, it might be wrong, but when the advertiser is buying millions of ad impressions and is looking for a 0.01% hit rate, the math shows that you're one of the more likely future customers.
Which makes at least a little sense when I'm on another site. But just this morning, Amazon started "recommending" a product to me that I'd actually bought from Amazon two months ago. How many printers does Amazon think I need?
Of course, as you note, printer companies realized people were actually doing this (due to the perverse incentives created by their ridiculous razors/blades business model) so now most printers come with pathetic "starter" ink cartridges that run out after about 10 pages.
The irritating bit though is that many cartridges don't last long (intentionally and unintentionally) once you open/install them, regardless of whether you actually print anything. So, if you don't print a lot, it might still make (financial) sense to buy a new printer... ;-[
I am tempted though to get a model with refillable ink tanks, but most of what little printing I do is on an old b/w laser printer which I've had forever. Still looking for the holy grail color laser printer that is cheap, networked, duplexing, compact, and can print photos.
If "the externalities are not fully priced in", isn't that just good for me? I hate to say it, but my financial situation hasn't been great recently; I might just go for it.
Before the days of the web, we had a solution for this that was better and easier for users and didn't invade privacy: UI design guidelines and UI consistency.
Apple is the only company that still occasionally does this, though even they have driven off into the realm of every application having its own entirely novel interface. Go back in time to MacOS or Windows in the 1990s and you'll find an entirely different paradigm: every application has the same interface... or at least the same interface paradigms. Learn the computer once and you've learned the computer.
Features were remarkably discoverable. They were organized logically in menus. Keyboard shortcuts were always available and usually intuitive. I remember opening a new app I'd never used before on Windows 95 and just unthinkingly hitting a keyboard combo and it doing the general thing I expected, or mousing to where I expected to find a feature I imagined should be there only to find that it actually was there. I never used Mac Classic much but I heard it was similar.
The web is what really broke this. Web UIs overtook desktop UIs due to the difficulty of installing local software and the power of trendiness. Web UIs were never uniform and couldn't be since the web was anarchistic and wild and often driven by designers who wanted to make their product look a specific distinct way.
I remember in the days immediately before the web there being talk of algorithmic generation of UIs from data schema. If the UI is thoroughly standardized then it becomes at least thinkable to examine data structures and generate user interfaces from them, even UIs that aren't horribly ugly or hard to use. This was building on a previous generation of incredible WYSIWYG UI design tools. Then the web came and all that stuff was completely abandoned.
Today's UI design tools in things like Xcode and Android Studio are horrible by comparison to what people were using in 1995. Go back and try Visual Basic on Windows 95. The VB language sucked but the UI designer was aeons ahead of anything we use today.
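For a toy illustration of the schema-to-UI idea (nowhere near what those tools did, and the field names are invented): once the widgets and layout are standardized, a form can be emitted mechanically from a description of the data.

```typescript
// Toy sketch: generate a form UI from a data schema.
// The schema and field names are invented for illustration.
type Field = { name: string; label: string; kind: "text" | "number" | "checkbox" };

const customerSchema: Field[] = [
  { name: "fullName", label: "Full name", kind: "text" },
  { name: "age", label: "Age", kind: "number" },
  { name: "subscribed", label: "Subscribed to newsletter", kind: "checkbox" },
];

function renderForm(schema: Field[]): string {
  const rows = schema.map(
    (f) => `<label>${f.label} <input name="${f.name}" type="${f.kind}"></label>`
  );
  return `<form>\n${rows.join("\n")}\n</form>`;
}

console.log(renderForm(customerSchema));
```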
> Analytics isn't just about conversion. Analytics is about the entire product experience.
... which you can entirely do by analyzing web server logs.
You don't need Google for this, pushing your users (in violation of GDPR consent) into the Google monstrosity. Don't do that; I block every freaking Google domain, from CDNs and fonts to analytics.
But I don't block 1st party analytics.
I have no reason to. I have visited your site; I don't have anything against YOU following what I read. It is your site. But you will ask me for consent to give that data to Google. And I will say 'no'. And I am not the only one.
Just to inform you: the person/entity that allows Google to gain access to PII data is directly responsible for this. If Google is fined due to a GDPR violation, you can get fined too for providing it a way to get users' data. They will survive. You might not.
Have your analytics, but you will not sell my soul (which GDPR explicitly forbids; you are handing over my PII data to a 3rd party that is known for violating it, and that makes you an accomplice) just so you can have your graphs.
You can get that data from web server logs. You will have all the data that you need. Actually more data, as no one will block them.
Needing "Google Analytics" is just a huge, giant, hype-driven lie. You don't need them to analyze what you already have in YOUR logs.
chaos_emergent: please do explain, what data does Google Analytics offer you beyond what is already in your server logs? Without violating GDPR even more? Yes, you can surely track something more, again "on your side". Don't give it to Google, as it WILL get blocked and you will have a distorted picture of how your site is being used. If you want real data, skip 3rd party analytics. Found a way to require being unblocked? I will skip your site; you have just lost a user. A paying user, if the content is worth the money. And sites that sell my data for a graph or two are not worth it.
I agree that people shouldn't be using Google Analytics. I disagree that people should just rely on their web server logs. Products are more than the data that is being accessed on them - copy and design make a product usable and don't show up in server logs. Am I missing something?
FWIW, there are many well-known techniques for shipping data to GA regardless of whether you block it or not. Many integrate server-side for this reason (as you say, server logs are very rich), and client side is used supplementally. Some are using sneaky techniques to move requests through 1P domains. Adblockers make zero difference.
> Some are using sneaky techniques to move requests through 1P domains
Just for info, I have written a MITM-ing proxy that takes care of those (CNAME cloaking) and a lot of other things, including fingerprinting, supercookies etc., with support for various blocklists (domains to adblock), JS injection where the injected scripts are handled in the same manner as blocklists (you can stockpile them and make rules), and changing the validity of cookies (to session cookies, for instance). It saves your data through highly effective caching and even helps spying CDNs save some bandwidth, as they are mostly no longer visited. And yes, it works as a transparent proxy too (for router-enforced usage). Imagine a fully armored Firefox for the whole network, regardless of browser, on FreeBSD, Linux, Windows, ARMv8. Working on a cross-platform GUI, release soon.
Yes, but analytics is also used to trick folks into buying a paper back full of shit.
A good product, that does what it advertises, does not need analytics. But a bad product, that somebody desperately wants to make successful, or at least successful enough to sell to a PE and exit, needs analytics.
The problem is that when making a product you're often wrong: what you think is a good product is often a bad product, or it's a nearly good product with a couple of fatal flaws that can only be seen in hindsight.
Some people have an uncanny sense of vision and seem to hit on the right ingredients more often than seems fair, but most companies aren't led by this kind of person.
You need things that tell you when and how to course-correct; this is what analytics gives you. Now, of course, this needs to be balanced against privacy concerns. I push back on things that track literally everything (the tools that record every click and cursor movement are fascinating, but undeniably creepy), and I try to avoid sending any PII to 3rd-parties. The amount of stuff Google Analytics phones home about by default is also pretty troubling.
I'm on board with basically every privacy-based criticism of tracking, but I don't buy this argument that only bad products benefit from it.
> the tools that record every click and cursor movement are fascinating, but undeniably creepy
That may not be so for a game developer who wants to modify gameplay that relies heavily on things like particular mouse cursor movements and mouse click usage. Especially if a meticulous gameplay goal is the objective.
"If they're performing a task quickly because it's easy, or slowly because they're struggling with the UX. And you find out that customers on a certain mobile device are suffering huge performance issues, for example."
...
"You don't know these things until you measure them."
If you don't build big, bloated tools using ultra-high-level frameworks and if your product is a simple tool that performs a single, useful task ... then you do know these things and you don't need analytics to tell you.