Jade plants were very popular as houseplants in the Soviet Union; everyone had one at home. That could be why the AI got confused there.
A 1.5-ton aluminum bar, chilled to absolute zero. This would be a massive project to set up.
I got 84 too, looks like we see colors differently from normal people.
I forgot I had night mode enabled on my phone; after turning it off I got closer to average, oops.
Funny you mentioned Arch, since the Steam Deck's SteamOS is based on Arch, so it is using Arch, btw.
Hydrogen sulfide can damage concrete, not sure about the chair tho.
It’s the other way around: you’ll get all of the tickets that are missing plate info. Some guy did it and regrets it; there’s a documentary about it.
17,000 years old, but it looks great, better than some medieval drawings.
A centrifuge spins really fast, so you need to balance where you put the samples, or else it will vibrate. The trick is to put them on opposite sides or equally spaced apart from each other.
I can’t quite understand your point. Are you arguing that both the JVM and WASM are bad? With that I agree; they both have terrible performance, and in an ideal world we wouldn’t use either of them.
Are you arguing that JVM bytecode is better than WASM? That’s objectively not true. One example is a function pointer in C. To compile it to JVM bytecode you would need to convert it to a virtual call in some very roundabout way. But WASM has native support for function pointers, which gives much better flexibility when compiling other languages.
Have you seen what it outputs? By the same logic we can compile C to Brainfuck; that doesn’t mean it’s good or useful.
You can’t compile C to Java bytecode; they are fundamentally incompatible. But you can compile C to WASM, which is what you want from a good universal bytecode. Java is shit.
This is per kilogram of your body mass. So if you weigh 80 kg, the lethal dose would be 96,000 mg, not 1,200. At least that’s how I understand it.
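The arithmetic above, as a quick sketch (the 1,200 mg/kg figure and the 80 kg body mass are the numbers from this thread, not a medical reference):

```python
# Per-kilogram dosing: multiply the per-kg figure by body mass.
dose_per_kg_mg = 1200   # mg per kg of body mass (figure from the thread)
body_mass_kg = 80

total_dose_mg = dose_per_kg_mg * body_mass_kg
print(total_dose_mg)  # 96000
```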
His haircut reminds me of someone
For example, when watching a 1080p YouTube video in Safari, the power consumption is only 0.1 W because it’s using the hardware decoders (not including display backlight, which I can’t measure). But when I play the same video in Firefox, which uses software decoding, the consumption is around 0.7 W. Not as good as the hardware decoders, but still less than a watt.
No, it’s just an easy sustained load that can be measured accurately. If you have some other application that provides a sustained load but doesn’t spin all the cores to 100%, please suggest it and I’ll try it.
I did some actual measurements just to confirm it. Here is Minecraft in the default configuration running at 100 fps, and the CPU+GPU consumption is around 6 W in total. If you add about 5 W for the display backlight and other components, the total would be 9-10 hours of play time on my 100 Wh battery.
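The estimate above is just capacity divided by draw; a quick sketch using the numbers from this comment (6 W CPU+GPU measured, ~5 W assumed for everything else):

```python
# Rough battery-life estimate from measured power draw.
battery_wh = 100   # battery capacity in watt-hours
cpu_gpu_w = 6      # measured CPU+GPU draw while playing
other_w = 5        # rough figure for backlight and other components

total_w = cpu_gpu_w + other_w     # 11 W total draw
hours = battery_wh / total_w      # ~9.1 hours
print(round(hours, 1))
```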
Can you please take the same measurements on your system? Maybe a Ryzen system is better than Intel; I’ve never had one.
Why MIPS and not RISC-V? I would assume RISC-V is easier to emulate in software and has good support in Linux.
EDIT: found it
Could it be that the PC-relative addressing often used on RISC-V would be slow to emulate on the 4004?