Some complicated pipelining optimization that claims to reduce the critical path of each pipeline stage, increasing the clock speed and/or decreasing the required voltage for a given clock speed, written in impenetrable patentese.
The easiest pipeline optimisation, which is basically free hardware-wise and has been documented in the academic literature for years, is to move the computation of W + K + H to the previous pipeline stage, reducing the critical path. This is a trivial win because all the data is already available and H can be discarded afterwards, which also has the nice bonus of mapping more efficiently onto certain FPGA hardware. It sounds like Intel have taken this and expanded on it, moving even more things into earlier and later stages of the pipeline.
Or, as Intel puts it, "The precomputed (H.sub.i+K.sub.i+W.sub.i) may be stored in the 32-bit register 402 dedicated for H.sub.i. This optimization reduces the critical path for the computation of E.sub.i+1 by one CSA or approximately three logic gates." I implemented this back in 2011 in some open source FPGA mining code, and it was an old trick even back then. They don't seem to be citing any prior art, which is a bit dubious.
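To make the trick concrete, here's a minimal software sketch of the idea in plain C (names like sha256_rounds_precomputed and hkw are mine, not from the patent or any real miner): the H + K + W sum for the next round is formed one round ahead, so the round itself is left with only the Σ1 and Ch additions on the critical path, which is what moving the add into the previous pipeline stage buys you in hardware.

    /* A hedged sketch, not Intel's code: SHA-256 compression rounds with
     * the h + k + w term for the *next* round folded in one round early,
     * mirroring the "move W + K + H into the previous stage" trick.
     * Names (sha256_rounds_precomputed, hkw) are mine. */
    #include <stdint.h>

    static inline uint32_t rotr(uint32_t x, unsigned n) {
        return (x >> n) | (x << (32 - n));
    }

    /* s[8]: chaining state a..h, w[64]: expanded message schedule,
     * k[64]: the standard SHA-256 round constants. */
    void sha256_rounds_precomputed(uint32_t s[8], const uint32_t w[64],
                                   const uint32_t k[64])
    {
        uint32_t a = s[0], b = s[1], c = s[2], d = s[3];
        uint32_t e = s[4], f = s[5], g = s[6], h = s[7];

        /* Round 0's h + k + w is computed up front, i.e. in the
         * previous pipeline stage of a hardware design. */
        uint32_t hkw = h + k[0] + w[0];

        for (int i = 0; i < 64; i++) {
            uint32_t S1  = rotr(e, 6) ^ rotr(e, 11) ^ rotr(e, 25);
            uint32_t ch  = (e & f) ^ (~e & g);
            uint32_t t1  = hkw + S1 + ch;   /* only two adds remain here */
            uint32_t S0  = rotr(a, 2) ^ rotr(a, 13) ^ rotr(a, 22);
            uint32_t maj = (a & b) ^ (a & c) ^ (b & c);
            uint32_t t2  = S0 + maj;

            /* Precompute the next round's h + k + w now: the next h is
             * the current g, and k/w for round i+1 are already known. */
            if (i < 63)
                hkw = g + k[i + 1] + w[i + 1];

            h = g; g = f; f = e; e = d + t1;
            d = c; c = b; b = a; a = t1 + t2;
        }

        s[0] += a; s[1] += b; s[2] += c; s[3] += d;
        s[4] += e; s[5] += f; s[6] += g; s[7] += h;
    }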
SHA-256 operates almost entirely on 32-bit values, because it's designed to be efficient to compute on general-purpose CPUs without special hardware support. The SHA-256 state consists of 8 32-bit integers, of which only 2 receive newly computed values each round (the rest are just shifted along). (SHA-512 uses 64-bit integers instead, which makes it faster on 64-bit CPUs.)
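If it helps, here's a rough illustration of that state and the per-round shuffle (the struct and function names are mine, purely for illustration): only e and a get freshly computed values, the other six words just shift down a slot.

    /* Illustrative only (my names, not from any particular implementation):
     * the 8 x 32-bit SHA-256 working state and the per-round shuffle.
     * Only e and a receive newly computed values; the rest shift down. */
    #include <stdint.h>

    typedef struct {
        uint32_t a, b, c, d, e, f, g, h;   /* 8 x 32-bit words */
    } sha256_state;

    /* t1 = h + Sigma1(e) + Ch(e,f,g) + k + w and t2 = Sigma0(a) + Maj(a,b,c)
     * are computed from the current state by the round logic. */
    static void sha256_round_shift(sha256_state *s, uint32_t t1, uint32_t t2)
    {
        s->h = s->g;  s->g = s->f;  s->f = s->e;
        s->e = s->d + t1;                  /* new value #1 */
        s->d = s->c;  s->c = s->b;  s->b = s->a;
        s->a = t1 + t2;                    /* new value #2 */
    }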
I wouldn’t read too much into this. Intel have very generous employee “patent creation” bonuses that encourage them to constantly file patents, ca. $2500 per patent IIRC.
That being said, it’s still interesting seeing Intel encouraging and embracing crypto ASIC design.
Is that just a flat $2500, no profit/royalty sharing? Seems like a good way to get people to churn out lots of low quality patents.
I like the scheme where I work--there's a relatively small cash bonus (a few hundred bucks), but the inventors get a nice chunk of any income from the patent. Another slice goes to the inventor's department's budget, so they can indirectly benefit there too.
When I was at Microsoft, it was $1000 and a trophy called a 'Patent Cube'. I knew a couple of people who had a profitable sideline of manically brainstorming bullshit patents together for every little thing they worked on. They had stacks of these trophies in their offices.
The award has gone up a bit since, but the patent cube is still alive and well. My first arrives in a few weeks. Can confirm the stacks - knew a dev who used them as risers for his monitors and consoles.
How does that work? You may have had the idea, but the actual development may involve hundreds of people, and the patent might cover only a small aspect of the product. How do you assign a value to the patent?
Also, you are right that this system results in lots of low-quality patents. At one employer I used to see chip designers cranking out 5-6 cell designs a quarter, making an extra $10k per quarter. Maybe 1% of those would actually be implemented, but it's worth it for the company to get every potential solution either established as prior art or patented.
Like someone else said, mostly licensing. The place where I work is a bit unusual (non profit, don't really have "products" and when we do it's typically a small team so everyone gets to be on the patent). I'm sure it's harder to divvy up at Intel, where a new CPU generation might take advantage of dozens of patents.
If your company's business model is to license patents, then you should align your employees' incentives to that, give them money when patents are licensed. That's what your employer does.
If your company's business model is to build and sell products, prevent competitors from being as good, and avoid getting sued, then it's beneficial to have as many patents as possible. So Intel also aligns its employees' incentives with its business model, by giving an incentive to create as many patents as possible.
It differs by company, but it is not that easily gameable. (I've tried many times and succeeded only once in filing an application that was completely trash.) The company's patent lawyers ultimately decide what they will and won't file, and they are more time-constrained than budget-constrained.
I've seen a split approach like 50% at disclosure and 50% at grant because the employer wants to see you follow through in assisting with the paperwork. Quite a few never reach grant stage or take so long that you may have even moved to a new employer.
Intel encourages employees to file all sorts of crappy patents, but I don't see it as a bad or evil practice; rather, it's a defensive strategy.
Meaning that they don't really enforce their patents against others; instead they need a trove of their own patents as a defense against patent trolls.
>instead they need a trove of their own patents as a defense against patent trolls
but the whole point of patent trolling is that you're suing via a non-practicing entity, which is a shell company that does nothing but hold patents, so there's nothing to sue back at.
Most likely it doesn't help against trolls but strengthens their position in cross-licensing negotiations with other companies. When a company's portfolio includes many thousands of patents, due diligence to discover the important ones becomes impossible and it's then a numbers game.
As usual the patent gets awarded to the party that doesn't have a product or even a proof of concept while the market has already moved on and has been selling the product for years.
Most patents that I come across read as land grabs, not as the work of companies that actually have working prototypes, proofs of concept, or any plans to produce them.
Being late never stopped anybody from filing for a patent.
I've been sued by patent trolls that somehow got a patent on live streaming video to the browser long after we were already doing it in public. Others have been on the receiving end of similar lawsuits. Fortunately our efforts were very well documented, so we managed to get rid of them, but it was quite annoying that such a patent ever got granted. Especially because I made a conscious decision not to patent it myself, since I felt it was a trivial thing and the time was simply ripe for doing it.
The narrative (whether correct or incorrect) is that Bitmain tried to block an extension because it would disable a feature in their hardware that was saving them tens to hundreds of millions of dollars per year on electricity.
That has nothing to do with patents; it's (supposedly) an advantage they had over the competition, and of course they are going to defend that advantage.
Patents for mining hardware, though, are largely unenforceable. The problem is that lots of private mining farms exist, and nothing stops one from developing a chip that violates many patents but never releasing the hardware publicly. There would be no way to tell that the patent is being violated, especially because mining is more or less anonymous in the first place.
No, you're spreading opinionated propaganda and waging a political witch hunt. And I don't think this is appropriate for HN.
/r/bitcoin is heavily censored and is an echo chamber. It is in no way representative of the bitcoin community.
Go to any in-person meetup and you'll find that the majority of old-era bitcoiners and experts are not on /r/bitcoin's side, and wanted the block size raised a year ago. I'd suggest diversifying your news sources, because /r/Bitcoin is more biased than Breitbart.
AsicBoost can be potentially used by anyone since it is now a defensive patent. Bitmain cares about this patent because they need to disclose all other patents they have if they want to use AsicBoost: "There Is a Bitcoin Patent War Going On, but This Initiative Could End It" [1]
That was only announced last month. I suspect the only reason they promised to use it for defensive purposes is that its utility is significantly reduced by the successful roll-out of segwit. Had segwit not passed, I doubt they would leave hundreds of thousands of dollars on the table by not licensing it.
Segwit did not break covert asicboost. The math proved that wrong, and that one could make very small, inexpensive changes to have it continue working with segwit.
This was admitted by Greg Maxwell on the dev mailing list, when they proposed an asicboost-blocker-only change and then rescinded it because they realized that neither it nor segwit actually stopped covert asicboost.
It was also confirmed by experts like chjj (CTO of purse.io and a Bitcoin developer) that segwit never blocked covert asicboost.
Tried to read the claims and realized again that patents are unreadable to an engineer and completely useless for any real advancement of the field.
Clearly, somebody was working on a mining chip design and took advantage of the patent incentives these companies offer employees to build a war chest. The lawyers then took the description they provided and turned it into this steaming pile of technolegal jargon.
(a) it is first /inventor/ to file. You cannot patent stuff you (for various values of you) didn't invent.
(b) it is still subject to prior art disclosures.