• 0 Posts
  • 82 Comments
Joined 3 years ago
Cake day: July 5th, 2023

  • It’s not feasible for a mass market consumer product like Starlink.

    Why not? That’s a service designed to serve millions of simultaneous users from nearly 10,000 satellites. These systems have to be designed to be at least somewhat resistant to unintentional interference, which usually makes them quite resistant to intentional jamming as well.

    Any modern RF protocol is going to use multiple frequencies, timing slots, and physical locations in three-dimensional space.

    And so the reports out of Iran are that Starlink service is degraded in places but not fully blocked. It’s a cat-and-mouse game out there.
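
    The “multiple frequencies and timing slots” point can be sketched with a toy model (a simplification, not how Starlink actually works): if a link hops pseudorandomly among N channels and a jammer can only blanket k of them, the jammer only corrupts roughly k/N of the time slots.

```python
import random

def simulate_hopping(n_channels=64, jammed_channels=8, slots=10_000, seed=42):
    """Toy frequency-hopping model: fraction of time slots lost when a
    link hops pseudorandomly among n_channels and a jammer blankets
    jammed_channels of them."""
    rng = random.Random(seed)
    jammed = set(rng.sample(range(n_channels), jammed_channels))
    hits = sum(1 for _ in range(slots)
               if rng.randrange(n_channels) in jammed)
    return hits / slots

# With 64 channels and a jammer covering 8 of them, only about 1/8 of
# slots are lost; forward error correction and retransmission can mask
# that, which is why service degrades rather than disappearing.
loss = simulate_hopping()
```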


  • I’d think that there are practical limits to jamming. After all, jamming doesn’t make radio impossible; it just forces the transmitter and receiver to get closer together (so that their signal strength over that shorter distance is enough to overcome the jamming coming from farther away). Most receivers filter out the frequencies they’re not looking for, so a jammer needs to actually be hitting that receiver with that specific frequency. And many modern antenna arrays rely on beamforming techniques that are less susceptible to unintentional interference or intentional jamming coming from a different direction than where they’re looking. Even less modern antennas can be heavily directional just from their physical design.

    If you’re trying to jam a city block, say a 100 m radius, across any and all frequencies that radios use, that’s going to take some serious power. Which will require cooling equipment if you want to keep it on continuously.

    If you’re trying to jam an entire city, though, it just might not be practical to hit literally every frequency that a satellite might be using.

    I don’t know enough about the actual power and equipment requirements, but it seems like blocking satellite communications between satellites you don’t control and transceivers scattered throughout a large territory is more difficult than you’re making it sound.
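
    The “get closer together” intuition can be made concrete with free-space path loss, where received power falls off with the square of distance. A rough sketch with made-up illustrative numbers, not a real Starlink link budget:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

def signal_to_jammer_db(p_tx_dbm, d_tx_m, p_jam_dbm, d_jam_m, freq_hz):
    """Received signal power minus received jamming power, in dB.
    Positive means the signal wins. Antenna gain is ignored here;
    beamforming would add margin on the signal side only."""
    s = p_tx_dbm - fspl_db(d_tx_m, freq_hz)
    j = p_jam_dbm - fspl_db(d_jam_m, freq_hz)
    return s - j

# Halving the link distance improves the margin by 20*log10(2) ≈ 6 dB
# against a fixed jammer, no matter the absolute powers involved:
f = 12e9  # Ku band, illustrative
margin_far = signal_to_jammer_db(30, 1000, 50, 5000, f)
margin_near = signal_to_jammer_db(30, 500, 50, 5000, f)
```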






  • My first Linux distro was Ubuntu in 2006, with a graphical installer from the boot CD. It was revolutionary in my eyes, because WinXP was still installed using a curses-like text interface at the time. As I remember, installing Ubuntu was significantly easier than installing WinXP (and then wireless Internet support was basically shit in either OS at the time).


  • Specifically, desktop RAM is slabs of silicon placed into little packages and soldered onto circuit boards (DIMMs or similar) that plug into a motherboard’s RAM slots.

    The AI demand is for the silicon itself, using advanced packaging techniques to put it on the same package as complex GPUs with very high bandwidth. Those same pieces of silicon are never even put into DIMMs, so if they fall out of use they’ll be pretty much intertwined with chips in form factors that a consumer can’t easily make use of.

    There’s not really an easy way to bring that memory back into the consumer market, even after the AI bubble bursts.




  • End to End Encryption protects the messages “between the ends”. If an “end” is compromised, the best E2EE technology isn’t going to protect confidentiality.

    Just ask Pete Hegseth, who invited a journalist into an E2EE Signal chat. The journalist was an authorized “end” and could therefore read the conversation.

    This change is about employers who already have full access to the “end”, the Android phone itself, when that phone is in an enterprise-managed state. Perfect encryption between that phone and other parties doesn’t change anything, because the employer has full access to the phone itself.




  • JavaScript for this seems like the wrong tool. The HTTP server itself can usually be configured to serve alternative images (including different formats) to supporting browsers: it serves JXL if supported, falls back to WebP if not, and falls back to JPEG if WebP isn’t supported.

    And increased server-side adoption of JXL can run up the usage stats, encouraging the Chromium team to resume support for JXL and the Firefox team to move support out from behind a flag in Nightly, especially because one of the most popular competing browsers (Safari on Apple devices) already supports JXL.
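
    The server-side fallback works off the request’s Accept header: browsers that can decode JXL advertise image/jxl. A minimal sketch of that negotiation (the preference order here is an assumption, and real servers like nginx do this in config rather than application code):

```python
def pick_image_format(accept_header: str) -> str:
    """Choose the best image format the client advertises support for
    in its Accept header, falling back to JPEG, which everything can
    decode. Quality factors (";q=...") are ignored for simplicity."""
    accepted = {part.split(";")[0].strip()
                for part in accept_header.split(",")}
    # Most-preferred format first (hypothetical preference order):
    for mime, ext in (("image/jxl", "jxl"),
                      ("image/webp", "webp")):
        if mime in accepted:
            return ext
    return "jpeg"

pick_image_format("image/jxl,image/webp,image/png,*/*;q=0.8")  # -> "jxl"
pick_image_format("image/webp,*/*")                            # -> "webp"
pick_image_format("image/png,*/*;q=0.8")                       # -> "jpeg"
```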


  • It’s not too late.

    The current standard on the web is JPEG for photographic images. Everyone agrees that it’s an inefficient standard in terms of quality for file size, and that its 8-bit RGB support isn’t enough for higher dynamic range or transparency. So the different stakeholders have been exploring new modern formats for different things:

    WebP is open source and royalty free, has wide support, especially from Google (which controls a major image search engine and the dominant web browser), and is more efficient than JPEG and PNG in both lossy and lossless compression. But it’s 15 years old and showing its age as we move towards cameras that capture better dynamic range than WebP’s 8-bit limits (or JPEG’s, for that matter). It’s still being updated, so things like transparency have been added (but aren’t supported by all WebP software).

    AVIF supports HDR and has even better file size efficiency than webp. It’s also open source and royalty free, and is maintained by the Linux Foundation (for those who prefer a format controlled by a nonprofit). It supports transparency and animation out of the box, so it doesn’t encounter the same partial support issues as webp. One drawback is that the AVIF format requires a bit more computational power to encode or decode.

    HEIC is more efficient than JPEG and supports high bit depth and transparency, but is encumbered by patents, so support requires royalty payments. The only reason it’s in the conversation is that it has extensive hardware acceleration support by virtue of its reliance on the HEVC/H.265 codec, and because it’s Apple’s default image format for new pictures taken by iPhone/iPad cameras.

    JPEG XL has the best of all worlds. It supports higher bit depths, transparency, animation, and lossless compression. It’s open source and royalty free. And most importantly, it has a dedicated compression path for taking existing JPEG images and losslessly shrinking the file size. That’s really important for the vast majority of digitally stored images, because people tend to only have the compressed JPEG version. The actual encoding and decoding is less computationally intensive than WebP or AVIF. It’s a robust enough standard not just for web images, but for raw camera captures (potentially replacing DNG and similar formats), document scans and other captured imagery (replacing TIFF), and large-scale printing (where TIFF is still often in the workflow).

    So even as WebP and AVIF and HEIC show up in more and more places, the constant push forward still lets JXL compete on its own merits. If nothing else, JXL is the only drop-in replacement where web servers can silently serve the JXL version of a file when supported, even if the “original” image uploaded to the site was in JPEG format, with basically zero drawbacks. But beyond the web, the technical advantages might support processing and workflows in JXL, from capture to processing to printing.


  • It’s like the relationship between mathematics and accounting. Sure, almost everything accountants do involves math in some way, but it’s relatively simple math that is a tiny subset of what all of mathematics is about, and the actual study of math doesn’t really touch on the principles of accounting.

    Computer science is a theoretical discipline that can be studied without computers. It’s about complexity theory and algorithms and data structures and the mathematical/logical foundations of computing. Actual practical programming work doesn’t really touch on that, although many people are aware of those concepts and might keep them in the back of their mind while coding.


  • People who get downvoted a lot end up with a ‘low reputation’ indicator next to their name. You’ll know it when you see it.

    Upvotes in meme communities do not add to reputation.

    I think any kind of reputation score should be community specific. There are users whose commenting style fits one community but not another, and their overall reputation should be understood in the context of which communities actually like them rather than some kind of global average.
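
    A per-community score like that could be as simple as aggregating votes grouped by community, ignoring upvotes from communities flagged as meme communities (the community names and exclusion list below are made up for illustration, not any real site’s mechanism):

```python
from collections import defaultdict

MEME_COMMUNITIES = {"memes"}  # hypothetical exclusion list

def community_reputation(votes):
    """votes: iterable of (community, delta) pairs, where delta is +1
    for an upvote and -1 for a downvote. Returns net score per
    community. Upvotes in meme communities don't add to reputation,
    but downvotes still count everywhere."""
    scores = defaultdict(int)
    for community, delta in votes:
        if delta > 0 and community in MEME_COMMUNITIES:
            continue
        scores[community] += delta
    return dict(scores)

votes = [("linux", 1), ("linux", 1), ("memes", 1), ("memes", -1)]
community_reputation(votes)  # -> {"linux": 2, "memes": -1}
```

    Keeping the scores in a per-community dict rather than one global counter is the whole point: a user’s number is only meaningful next to the community it was earned in.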