• 0 Posts
  • 560 Comments
Joined 2 years ago
Cake day: June 16th, 2023



  • I think that one was also significantly a publicity thing. They made videos and announced it as a neat story about the Air Force doing something “neat” and connecting a relatable gaming platform to supercomputing. I’m sure some work was actually done, but I think they wouldn’t have bothered if the device were not so “cool”.

    There were a handful of such efforts, each pushing a few thousand units. Given PS3 volumes were over 80 million, I doubt Sony lost any sleep over them. If anything, I recall Sony using those clusters as marketing collateral to say how awesome their platform was, with any losses on those units more than covered by the marketing value.


  • That’s just pointing out that upgrades carry a large price premium, not that the base model is sold at a loss.

    Which is a super common strategy in pre-built systems, especially ones that can’t take third-party upgrades even in theory. A mobile platform will commonly charge a hundred-dollar premium for maybe twenty dollars’ worth of UFS storage. At some points PC vendors have even used DIMM SPD lockouts to force customers to first-party parts so they could charge a significant multiple of market rate.

    I doubt anything in Apple’s lineup is sold at a loss. They might tolerate slimmer margins on entry, but I just don’t think they go negative.



  • I think that was overstated. Sure, there were some projects done for fun or publicity.

    However, supercomputer clusters required higher-performance interconnects than the PS3 could offer. At that time that would have meant DDR InfiniBand (about 20 Gbps) or 10G Myrinet.

    Sure, gigabit Ethernet was prevalent, but generally at places that would also have little tolerance for something as “weird” as the Cell processor.

    OtherOS was squashed out of fear of the larger jailbreak attack surface it exposed.







  • Pretty spot on, and the lessons were worth remembering; Valve actually seemed to remember them.

    Their first go at it was “make a viable platform and the developers/publishers will make the effort to come over, and hardware partners will step up with offerings because of Valve’s brand strength and fear of the Microsoft Store screwing everything up.” That didn’t work. The Microsoft Store also didn’t pan out the way Valve and others feared, though Microsoft has been kind of screwing up the platform, particularly for games, as it chases subscription revenue instead of transactional revenue.

    Valve learned they needed to work harder to bring Windows games to the platform, hence the heavy investment in Proton. They learned they had to take the hardware platform into their own hands, because the OEMs wouldn’t commit until they saw proof it could work for them. And they learned that the best way to package those improved efforts was with a “hook” with mass-market appeal: enter the Steam Deck, recognizing the popularity of the Switch form factor and bringing it to the PC market at a time when no one else was bothering.

    So now they have a non-Android, non-Windows ecosystem that covers handheld, console/desk, and VR with a compelling library of thousands and thousands of games…


  • This is more about material cost than relative value. If you save money on the passthrough and incur a few costs above the Quest 3, but nothing dramatic, then I’m just saying the pricing needs to be in the ballpark of the Quest 3. Better value can come from smarter choices that may not have a cost impact (e.g. using a mainstream high-end SoC instead of a niche SoC, putting the battery at the back instead of making the headset front-heavy).

    Of course they may be hampered by different business needs. Meta being able to risk more money than Valve can might drive a higher price point, which would be unfortunate.


  • The SoC may be better, but I don’t know that it would be more expensive. Meta went with a more niche SoC, while Valve selected a more mainstream, newer one: better specs, but also larger volumes, so cost-wise I think Valve should be fine. Comfort certainly seems like it should be better, but I see that as making better decisions rather than spending more.

    The wireless dongle certainly counts in its favor; I’m just thinking that, on balance, some things should add to the BOM price and some should save on it, so when all is said and done it should be roughly in the ballpark of the Quest 3, not 2x the cost.


  • Well, even with your observation, it could well be losing share to Mac and Linux. Windows users are more likely to jump ship, and Mac and Linux users tend to stick with their platform, mainly because it’s not actively working to piss them off. Even if zero users jump to Mac or Linux, the share could still shift.

    The upside of ‘just a machine to run a browser’ is that it’s easier than ever to live with the Linux desktop, since the nagging application or two that kept you on Windows has likely moved to being browser-hosted anyway. The downside, of course, is that the app is now much more likely to extract a monthly fee from you instead of letting you ‘just buy it’.

    Currently for work I’m all Linux, precisely because work was forced to buy Office365 anyway, and the web versions work almost as well as the desktop versions for my purposes. (I did have to boot Windows once because a presentation’s weird-ass “master slide” needed editing, which for whatever reason is not allowed on the web.) VS Code natively supports Linux (well, ‘natively’; it’s a browser app disguised as a desktop app), but I would generally prefer Kate anyway. Except work is now tracking our GitHub Copilot usage, so I have to let Copilot throw suggestions at me to discard in VS Code or else get punished for failing to meet stupid objectives.


  • “Agentic” is the buzzword to distinguish “LLM will tell you how to do it” versus “LLM will just execute the commands it thinks are right”.

    Particularly if a process is GUI-driven, the agentic approach is seen as theoretically more useful, since an LLM ‘how-to’ would still be tedious to walk through yourself.

    Given how often an LLM mis-predicts and doesn’t do what I want, I’m nowhere near the point where I’d trust “agentic” approaches. Hypothetically, if it could be constrained to a domain where it can’t do anything that can’t trivially be undone, maybe. But given, for example, a recent VS Code issue where the “jail” placed around agentic operations turned out to be ineffective, I’m not thinking much of such claimed mitigations.


  • My career is supporting business Linux users, and to be honest I can see why people might be reluctant to take on Linux users.

    “Hey, we implemented a standard partition scheme that allocates almost all our space to /usr and /var; your installer using /opt doesn’t give us room to work with” versus “Hey, your software went into /usr/local, but clearly the Linux filesystem standard is for such software to go into /opt.” The good news is that Linux is flexible, and sometimes you can point out “you can bind mount /opt to wherever you want,” but then some of them will counter “that sounds like too much of a hack, change it the way we want.”

    Now this example by itself is simple enough: make this facet configurable. But rinse and repeat for an insane number of possible choices. Another group at my company supports Linux, but only as a whole virtual machine provided by the company; the user doesn’t get to pick the distribution or even access bash on the thing, because they hate the concept of trying to support Linux users.
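    For the curious, the bind-mount workaround looks something like this. This is a sketch with illustrative paths (the /var/opt-overlay directory name is made up, not from any vendor’s docs), and it requires root:

    ```shell
    # Hypothetical scenario: the installer hard-codes /opt, but the site's
    # partition scheme gives nearly all the free space to /var.
    # A bind mount makes /opt refer to a directory on the roomy filesystem:
    mkdir -p /var/opt-overlay
    mount --bind /var/opt-overlay /opt

    # To make the mount persist across reboots, add an fstab entry:
    echo '/var/opt-overlay /opt none bind 0 0' >> /etc/fstab
    ```

    Files written under /opt then physically land on the /var filesystem, which is exactly the kind of thing some sites consider an elegant fix and others consider “too much of a hack.”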

    An extra challenge is supporting an open source project with the Linux community: “I rewrote your database backend to force all reads to be aligned at 16k boundaries, because I made a RAID of 4k disks and figured 16k alignment would work really well with my storage setup, but I ended up cramming up to 16k of garbage into some results. I’m going to complain about the data corruption, and you won’t know about my modification until we screen-share and you try to trace it and see some seeks that don’t make sense.”




  • Except he directly said just that.

    Generally I agree that often he’ll make some flub and a bigger deal is made of it than warranted. Like the ‘Miracle Mile’ vs. ‘Magnificent Mile’ thing: he said the wrong name, but that’s the least of the problems with that story, and a fairly mundane and understandable mistake to make.

    This time the statement is exactly as said, though the real-world consequences are similarly low.