Time to file the divorce papers.
I do think its target audience was kids. I had a younger family member (about 12) introduce me to it at the time. I got a kick out of the gameplay and style. It was sort of a spoof of a ’90s video game (name I can’t recall) with a common theme then: sort of an over-the-top, “we’re both in on the joke” kind of thing.
I’m sure they will. It’s always a cat-and-mouse game.
It’s been a while since I read about DRM, but as I recall, the challenge is not being able to control the chain end-to-end, which is what really drives trusted-boot efforts in both Android and Windows.
If you don’t control the hardware and OS, then someone can use it to sidestep DRM.
Oh, I get what they’re doing, but I resent their approach.
So many just introduced the subscription to sucker the naive.
I don’t mind paying for software. So let me pay for a major version, and if I want a major update, that costs too. I have so much software where a given version works just fine (FolderSync, for example, and Office 2016) that I see no need to upgrade.
It was about being fun, not about being a serious game, so approach it that way. It has a bit of silliness.
Remember it came out around the time the Austin Powers movies were a big hit.
Meh. A motorcycle will split a deer in two. Not really the hill to be dying on.
Are you looking for selective sync, and just over the LAN or over the internet too?
If just LAN, there are many Windows sync tools for this with varying levels of complexity and capability, even just a simple batch file with a copy command.
I’ll often just set up a Robocopy job for something that’s a regular sync.
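If you’d rather script it yourself, the same one-way mirror idea (what robocopy /MIR does) can be sketched in Python. This is a minimal sketch with hypothetical paths; it skips all the retry/logging options Robocopy gives you:

```python
# Minimal one-way mirror: make dst match src (robocopy /MIR in spirit).
import shutil
from pathlib import Path

def mirror(src: str, dst: str) -> None:
    src_p, dst_p = Path(src), Path(dst)
    dst_p.mkdir(parents=True, exist_ok=True)
    # Pass 1: copy new or newer files from src into dst.
    for f in src_p.rglob("*"):
        target = dst_p / f.relative_to(src_p)
        if f.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        elif not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)
    # Pass 2: delete anything in dst that no longer exists in src
    # (reverse sort so files are removed before their parent directories).
    for f in sorted(dst_p.rglob("*"), reverse=True):
        if not (src_p / f.relative_to(dst_p)).exists():
            f.rmdir() if f.is_dir() else f.unlink()

# Example (hypothetical paths):
# mirror("C:/Data", "//NAS/Backup/Data")
```

Schedule it with Task Scheduler (or cron) and you’ve got the same regular-sync setup.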
If you open files over a network connection, they stay remote, and saving writes back to the remote location. This isn’t best practice, though (Windows and apps are known for having hiccups with remotely opened files).
Two other approaches:
Resilio Sync enables selective sync. If you change a file you’ve synchronized locally, the changed file will sync back to the source.
A mesh network such as WireGuard, Tailscale, or Hamachi. Each lets you maintain an encrypted connection between your devices that the system sees as a LAN. If you’re only using Windows, I’d recommend starting with Hamachi; it’s easier to get started. If mobile device support is needed, use WireGuard or Tailscale (Tailscale uses WireGuard under the hood, but is easier to set up).
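For a sense of how light the Tailscale route is, setup on a Linux box is basically two commands. This assumes their standard install script and that you already have an account; check their docs for other platforms:

```shell
# Install via Tailscale's documented one-line installer, then bring the node up.
curl -fsSL https://tailscale.com/install.sh | sh
sudo tailscale up      # prints an auth link to approve the device
tailscale ip -4        # shows this device's mesh IP
```

Repeat on each device, and after that they can reach each other at those IPs as if they were on the same LAN.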
I don’t believe the ISP would have legal standing to take you to court, as they don’t hold the copyright.
They’re in the middle, being told by copyright holders that someone using their service is violating copyright, and they must “do something”.
Eventually they may shut off your service, but I haven’t heard of that happening since the early 2000s.
I refuse to subscribe to apps. Devs doing so for no good reason get 1 star, and I delete the app.
Screw em.
Now, if an app has a back end, or has to host a resolver (Resilio Sync, Tailscale, etc), or provide other necessary services, that’s different.
Cloud can be surprisingly cost effective, as part of a 3-2-1 backup.
Check out storj.io
If it’s powered off, you’ll have no idea when it dies. And they do die just sitting there.
I’ve actually had more failures of drives sitting around than ones running constantly.
Just that you don’t need a beast of a machine (with its higher cost and power consumption) just to serve files at reasonable performance. If you want to stream video, you’ll need greater performance.
For example, my NAS is ten years old, runs on ARM, with maybe 2 gigs of RAM. It supposedly can host services and stream video. It can’t. But its power draw is about 4 watts at idle.
My newer (5 year old) small form factor desktop has a multi-core Intel cpu, true gigabit network card, a decent video card, with an idle draw of under 12 watts, and peaks at 200w when I’m converting video. It can easily stream videos.
My gaming desktop draws 200w at idle.
My SFF and gaming rig are both overkill for simple file sharing, and both cost 2x to 4x more than the NAS (bought the NAS and SFF second hand). But the NAS can’t really stream video.
Power draw is a massive factor these days, as these devices run 24/7.
RPi is great for its incredibly low power draw. The negative of RPi is you still need an enclosure, and you’ll have drives attached to it that draw power. In my experience, once I’ve built a NAS, an RPi doesn’t draw significantly less than my SFF with the same drives installed, as the drives seem to be the greatest consumer. As I mentioned, my SFF with 1TB of storage draws 12 watts, and an RPi will draw upwards of 8 watts on its own (my Pi Zero draws 2, but I’d never use it for a NAS). It’s all so close that, for me, the downside of RPi isn’t worth the difference in power.
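To put numbers on that: here’s a quick sketch of yearly energy cost for an always-on box. The $0.15/kWh rate is just an assumed figure, so substitute your local rate:

```python
# Back-of-envelope yearly energy cost for a device running 24/7.
def yearly_cost(watts: float, rate_per_kwh: float = 0.15) -> float:
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * rate_per_kwh

print(round(yearly_cost(12), 2))  # SFF at 12 W idle -> 15.77
print(round(yearly_cost(8), 2))   # bare RPi at 8 W  -> 10.51
```

About five dollars a year of difference, which is why the RPi’s other downsides aren’t worth it to me.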
My experience with all the media servers is not great.
Spun up Jellyfin once again just last weekend; the quality was not great, and it had issues streaming. Just like every time I’ve tried any media server.
The answer for me is a media player pc at the TV running something like Kodi.
Remember, 3-2-1 is all about not fully trusting any one backup.
I have 4 replicas of my data at home (because any one of them could die at any time), plus an online backup. Not the best setup, but it’s what I can do at the moment.
As for RAID, that’s a solution to a specific set of problems.
Wow, neat approach.
I don’t update unless I’m bored
Hahahaha, one of my kind!
My upgrades usually occur because I’m setting up a new system anyway; that way my effort builds for tomorrow in addition to the upgrades, and I get testing time to ensure the changeover is pretty smooth.
As I said “how to reproduce this in a home setup”.
I’m running multiple machines, paid little for all of them, and they all run at pretty low power. I replicate stuff on a schedule, and I have a cloud backup I verify quarterly.
If OP is thinking about how to ensure uptime (however they define it) and prevent downtime due to upgrades, then looking at how enterprises do things would be useful; they’re the people who apply research on this very subject from universities and organizations like Microsoft and Google.
Nowhere did I tell OP to do things this way, and I’d thank you to not make strawmen of my words.
In the business world it’s pretty common to do staged or switchover upgrades: test the new version in a lab environment and iron out the install/config details. Then upgrade a single production server and test with a small group of users. Or build new servers with the new stuff and have a set of users run on them for a while; that way you can always move those users back to a known-good server.
How do you do this at home? VMs for lots of stuff, or duplicate hardware for NAS type stuff (I’ve read of running TrueNAS in a VM).
To borrow from the preparedness community: if you have 1 you have none, if you have 2 you have 1. As an example, the business world often runs mission-critical systems in a redundant setup in regionally-different data centers, so a storm won’t take them down. The question is how to reproduce this idea in a home lab environment.
The violation they target users for is sharing a video, and that’s usually through a file-sharing protocol like torrenting.
Think of it this way: whatever you watch online, via a browser or an app, you’re already downloading.
You know, it really tweaks me that torrenting is associated with piracy, when it could’ve become the de facto way to share files between users if OS devs had just included the protocol in the OS (looking at you, Android, but Windows and Apple too).
I’ve often questioned why it wasn’t…