Deja Vu - again

ubuysa

The BSOD Doctor
I've just been reading this article from TechRadar. It's about the advantages/disadvantages of cloud computing and edge computing; the strapline is "Rethinking remote working infrastructure".

Really?

I remember having exactly the same arguments several decades ago about the advantages/disadvantages of centralised mainframe computers and personal computers. The same arguments now being made for edge computing (reduced latency, data processing at the point of use, improved security) were made for personal computing back in the 1990s. And the same arguments now being made for cloud computing (data and application sharing, greater processing power, centralised data backup) were made for mainframe computing back in the 1990s.

The rush to personal computing in the 1990s resulted in a colossal waste of money and resources, and in the (often too late) realisation that there are some things you need a centralised mainframe for (that's why they're still there today). Please don't tell me the current generation of planners is going to make those same mistakes again? Rethinking remote working infrastructure? Hardly. Been there, done that. :rolleyes:

Scott

Behold The Ford Mondeo
Moderator
My guess is it's a revisit more than a rethink.

There's also another angle, though, and this is where my cynical nature comes into play. During this pandemic my company were reactive to the situation rather than proactive. They have MASSIVE VPN server capacity (@Tony1044 actually alluded to installing them in the HP Datacenters)... unfortunately it wasn't enough, and there was actually only enough resource to cover around half of the workforce that needed to work from home. Very short-sighted on the company's behalf, though they did react, of course. You can imagine the thoughts of the higher-ups on this matter; they weren't best pleased, as it's a substantial expenditure to have a system that's not up to scratch. But that's the way it goes when you sell ideas to people with large budgets and no idea what they're buying.

IMO it's about the uneducated and inexperienced getting a feel for how things can work if the infrastructure is considered from a wider view than it currently is. Fluidity and flexibility for growth are going to be the key areas now, I think. That's not going to be THAT easy, though, as there's always a tab to pick up somewhere.

With regards to the actual "rethinking", it's like watching an Apple feature release where they suggest doing something innovative... that Android did years ago. That's the same "rethinking" that's going on in that article and in the trending world. The old guard will sit and scoff, because it was all considered before, and with a more open mind than people have cared to apply for a long time. Now that it's relevant again, it's considered innovation.

Tony1044

Prolific Poster
It's interesting - as a lot of my work has been around server-based computing (i.e. Citrix, Remote Desktop Services, etc.), this has been an ongoing conversation for decades now, like you say @ubuysa.

It's notable that Teams use went from c. 28 million concurrent users to over 80 million in the blink of an eye - MS actually stopped customers standing up new virtual machines because they were using every bit of spare Azure capacity to cope. But cope they did.

The council I am currently working with had to introduce specific hours in the working day when users were allowed to connect to their VPN, otherwise it just ground to a halt - designed for around 1,200 concurrent users, it saw over 4,500. And we have to somehow deploy M365 Apps over it from next week...

Centralised computing - specifically RDS or Citrix - has a couple of massive benefits. Because only mouse, keyboard and screen data traverse the connection between the client and the server, it's very good for low-bandwidth environments. The data never leaves the core network, so opening, say, massive spreadsheets can take far less time. And of course it's secure - you can block saving to local resources, use thin clients so no data persists, etc.

They have downsides, though - they're not easy to optimise for video, they're expensive, and one user can bring a system down for many others.

As a side note, I have yet to see much in the way of use cases for VDI - you take a traditionally harder-to-manage desktop OS, throw it on top of a traditionally complex server-based computing model, and to top it off you then need super-fast, super-low-latency storage, networking, etc.

From a security perspective there's a lot to like about Azure (I don't work in the Amazon space but I assume they do something similar) - there's a Secure Score which tells you exactly what you're doing right and wrong and what you need to do to improve, for example.
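For anyone curious, here's a minimal sketch of reading that score programmatically via the Microsoft.Security secureScores REST API - the subscription ID is a placeholder, and it assumes you're already signed in (e.g. via az login) with the azure-identity and requests packages installed:

# Sketch: read the Secure Score for one subscription.
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")
resp = requests.get(
    f"https://management.azure.com/subscriptions/{subscription_id}"
    "/providers/Microsoft.Security/secureScores?api-version=2020-01-01",
    headers={"Authorization": f"Bearer {token.token}"},
)
resp.raise_for_status()
for item in resp.json().get("value", []):
    score = item["properties"]["score"]
    print(f'{item["name"]}: {score["current"]} / {score["max"]}')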

None of the connections between Azure and on-premises resources are initiated from Azure, so there's no need for complex VPN or firewall rules*

You get per-second billing on some resources, allowing you to ramp up and down as required - great for companies that run, say, a large end-of-month report. (NFU Mutual used to have to buy professional workstations because once a month they ran a grid-based report that used all the combined processing power. It was clever, but other than that one report the machines were massively overpowered for their day-to-day use.)
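That ramp-up/ramp-down is scriptable, too. A minimal sketch using the azure-mgmt-compute SDK - the resource group and VM names here are made up for illustration; deallocating stops the compute billing, starting brings the box back for the monthly run:

# Sketch: stop paying for a VM's compute outside the report window.
# Assumes azure-identity and azure-mgmt-compute are installed.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

client = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Deallocate: compute billing stops (disks are still charged).
client.virtual_machines.begin_deallocate("report-rg", "grid-node-01").result()

# Later, bring it back for the end-of-month job.
client.virtual_machines.begin_start("report-rg", "grid-node-01").result()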

I've never been much of a fan of the term "cloud-based computing" - it's just someone else's servers. "Utility-based" makes more sense when you think about it because, like electricity, gas and water, a consumer only cares that when they flick a switch or open a tap they get a light turned on or a glass of water... they usually care not a jot how it gets there or who supplies it.

Mind you, it's better than this complete garbage phrase "serverless computing"... I mean... ok you are consuming (usually) microservices and you don't care about the underlying architecture, but still... silly name.
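To be fair to the silly name, the consumption model it describes really is that simple. A minimal sketch of an HTTP-triggered Azure Function (Python v1 programming model; the function.json binding config is omitted) - you write the handler, and whose server runs it is somebody else's problem:

# Sketch: a "serverless" HTTP endpoint - there are still servers,
# you just never see them.
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name} - served from somebody's server.")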

I think, though, this is more about a policy shift - there've always been sneaking suspicions from employers that if employees work from home, they goof off, watch TV, go shopping... basically anything other than work.

Turns out that with no commutes, fewer constraints and a generally better work-life balance, productivity has increased.

So the question tends to become more geared towards: can we sell property? Can we reduce the amount we pay on rents, rates and utilities? Can we make this a permanent thing?

The biggest mistake I've seen with anything cloud-related is management (and sometimes technical staff) thinking "Yup... someone else's problem now..." and having it bite them hard.

@Scott - from your earlier point about aerospace... do you work for a large Derby-based company by any chance?

Edit: I just read the article and yeah... the 1990s called (ooh, we have mobile data now...)