In the area I live, this would mean you could be standing right next to the store cooking the pizza and still be outside of the delivery range.
Indeed or Monster or someone should just publish an open source JSON spec, and then by sheer weight make it the default. I don’t understand why they haven’t done so.
Have you tried turning them off, then turning them on again?
I’ve seen a few projects rename themselves during major version upgrades, when everyone has to read the release notes and make changes anyway.
Plenty of old deployed systems may continue using master/slave terminology, and of course some projects will stick to that language even decades in the future, but it was once more prevalent than it is now, and that declining trend looks like it will continue.
I think we’re still headed up the peak of inflated expectations. Quantum computing may be better at a category of problems that do a significant amount of math on a small amount of data. Traditional computing is likely to stay better at anything that requires a large amount of input data, or a large amount of output data, or only uses a small amount of math to transform the inputs to the outputs.
Anything you do with SQL, spreadsheets, images, music and video, and basically anything involved in rendering is pretty much untouchable. On the other hand, a limited number of use cases (cryptography, cryptocurrencies, maybe even AI/ML) might be much cheaper and faster with a quantum computer. There are possible military applications, so countries with big militaries are spending until they know whether that’s a weakness or not. If it turns out they can’t do any of the things that looked possible from the expectation peak, the whole industry will fizzle.
As for my opinion, comparing QC to early silicon computers is very misleading, because early computers improved by becoming way smaller. QC is far closer to the minimum possible size already, so there won’t be a comparable “then grow the circuit size by a factor of ten million” step. I think they probably can’t do anything world-shaking.
I think very few people mind changing it, and a few people want it changed, so it’s slowly shifting across various use cases. I’ve only discussed the change from master/slave terminology with one person who affirmatively supported the change, and they didn’t know that there’s still slavery in the world today.
I don’t know what to make of that, other than to say ending human slavery ought to be a higher priority than ending references to it.
You can buy high-CRI (97–99) LEDs for things like the film industry, where it really does matter. They are very expensive, but can pay for themselves with longer service life and lower power draw for long-term installations.
The CRI on regular LED bulbs was climbing for a long time, but it seems as though 90ish is “good enough” most of the time.
You can just issue new certificates once per year, and otherwise keep your personal root CA encrypted. If someone is into your system to the point they can get the key as you use it, there are bigger things to worry about than them impersonating your own services to you.
A lot of businesses use the last 4 digits separately for some purposes, which means that even if it’s salted, you’re only left with roughly 100,000 options: the first six digits are the issuer’s BIN, the last four are known, and the Luhn check digit constrains the six middle digits. That’s trivial to run through.
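To make the arithmetic concrete, here’s a minimal sketch of that brute force. The salt-prefix SHA-256 scheme and the helper names are assumptions for illustration, not any particular system’s design: with the BIN and last four known, only the six middle digits get enumerated, and the Luhn check digit (the card’s final digit) prunes nine out of ten candidates before you even compute a hash.

```python
import hashlib

def luhn_valid(number):
    """Return True if the digit string passes the Luhn checksum."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def crack(salt, target_hash, bin6, last4):
    """Recover a 16-digit card number from a salted SHA-256 hash,
    given the issuer BIN (first 6 digits) and the last 4 digits.
    Only the 6 middle digits are unknown; Luhn filtering skips ~90%
    of the 1,000,000 candidates, leaving ~100,000 hashes to try."""
    for middle in range(1_000_000):
        card = "{}{:06d}{}".format(bin6, middle, last4)
        if not luhn_valid(card):
            continue  # check digit doesn't work out; no need to hash
        if hashlib.sha256((salt + card).encode()).hexdigest() == target_hash:
            return card
    return None
```

Running ~100,000 SHA-256 hashes takes well under a second on commodity hardware, which is the point: the salt stops precomputed tables but does nothing against a search space this small. (The number below is the well-known 4242… test card, not a real one.)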
Don’t joke about this, the college professors will hear you.
The game theory one is easy. Put down 999,999,999,999 factorial. Then everyone got it wrong, and the curve will reflect that.
When a monopoly is faced with a smaller, more efficient competitor, they cut prices to keep people from switching, or buy the new competitor, make themselves more efficient, and increase profits.
When Steam was faced with smaller competition that charged lower prices, they did: nothing. They’re not the leader because of a trick or clever marketing, but because they give both publishers and gamers a huge stack of things they want.