Actually I think this is a pretty common thing. I know several people who use iPhones and other Apple products specifically to avoid the google alternatives.
Organizations aren’t just paying for access to applications; they’re also paying for cloud storage, email hosting, calendar tools, training, and all of the infrastructure to support that. Typically, when you price out the cost of expanding the in-house IT department plus the cost of acquiring and maintaining the infrastructure required to replicate the various cloud services, it ends up being break-even at best. Qualified people who can set up and maintain that infrastructure are quite expensive, especially when they have to maintain high uptime/availability, 24/7 incident response, and compliance with various regulations, like those protecting students’ privacy.
Why was Web 2.0 a mistake and what does that have to do with centralization?
As someone who works with and knows several military contractors, I’ve never heard of the US taking ownership of any code written. In fact, most of what they’re paying for is for companies to extend software they’ve already written to better fit the government’s use case, such that even if the government owned the new improvements, that code wouldn’t function without the base application that pre-dates the government contract.
The internet was developed by ARPA, then later made available to universities and eventually to private connections. Military and public research developed the technology; capitalists figured out how to most efficiently sell junk using it.