16TB Cloud Databases (Part 1)

I could claim the purpose of this blog post is to talk about Amazon RDS increasing the storage per database to 16TB, and to some extent it is.  It’s also an opportunity to talk about the challenges of a hyperscale environment.  Not just the world of AWS, but for Microsoft Azure, Google Cloud, and others as well.  I’ll start with the news, and since there is so much ground to cover I’ll break this into multiple parts.

As part of the (pre-) AWS re:Invent 2017 announcements Amazon RDS launched support that increased maximum database instance size from 6TB to 16TB for PostgreSQL, MySQL, MariaDB, and Oracle.   RDS for Microsoft SQL Server had launched 16TB database instances back in August, but with the usual RDS SQL Server restriction of them not being scalable.  That is, you had to pick 16TB at instance create time.  You couldn’t take a 4TB database instance and scale its storage up to 16TB.  Instead you would need to dump and load, or use the Native Backup/Restore functionality, to move databases to the new instance.  If the overall storage increase for RDS was lost in the noise of all the re:Invent announcements, the fact that you could now scale RDS SQL Server database instance storage was truly buried.  The increase to 16TB databases benefits a small number of databases for a small number (relatively speaking) of customers, while the scalability of SQL Server database instance storage benefits nearly all current and future RDS SQL Server customers.

While RDS instances have been storage limited, Amazon Aurora MySQL has offered 64TB for years (and Aurora PostgreSQL was also launched with 64TB support).  That is because Aurora was all about re-inventing database storage for the cloud, so it addressed the problems I’m going to talk about in its base architecture.  Non-Aurora RDS databases, Google’s Cloud SQL, Azure Database for MySQL (or PostgreSQL), and even Azure SQL Database (which, despite multiple name changes over the years, traces its lineage to the CloudDB effort that originated over a decade ago in the SQL Server group) have all lived with the decades-old file- and volume-oriented storage architectures of on-premises databases.

Ignoring Aurora, cloud relational database storage sizes have always been significantly limited compared to their on-premises counterparts.  I’ll dive into more detail on that in part 2, but let’s come up to speed on some history first.

Both Amazon RDS and Microsoft’s Azure SQL Database (then called SQL Azure) publicly debuted in 2009, but had considerably different origins.  Amazon RDS started life as a project by Amazon’s internal DBA/DBE community to capture their learnings and create an internal service that made it easy for Amazon teams to stand up and run highly available databases.  The effort was moved to the fledgling AWS organization, and re-targeted to helping external customers benefit from Amazon’s learnings on running large highly-available databases.  Since MySQL had become the most popular database engine (by unit volume), it was chosen to be the first engine supported by the new Amazon Relational Database Service.  RDS initially had a database instance storage size limit of 1TB.  Now I’m not particularly familiar with MySQL usage in 2009, but based on MySQL’s history and capabilities in version 5.1 (the first supported by RDS), I imagine that 1TB covered 99.99% of MySQL usage.  RDS didn’t try to change the application model, indeed the idea was that the application had no idea it was running against a managed database instance in the cloud.  It targeted lowering costs while increasing the robustness (reliability of backups, reliability of patching, democratization of high availability, etc.) of databases by automating what AWS likes to call the “undifferentiated heavy lifting” aspects of the DBA’s job.

As I mentioned, Azure SQL started life as a project called CloudDB (or Cloud DB).  The SQL Server team, or more precisely remnants of the previous WinFS team, wanted to understand how to operate a database in the cloud.  Keep in mind that Microsoft, outside of MSN, had almost no experience in operations.  They brought to the table the learnings and innovations from SQL Server and WinFS, and decided to take a forward-looking approach.  Dave Campbell and I had spent a lot of effort since the late 90s talking to customers about their web-scale application architectures, and were strong believers that application systems were being partitioned into separate services/microservices with separate databases.   And then those databases were being sharded for additional scalability.   So while in DW/Analytics multi-TB (or in the Big Data era, PB) databases would be common, most OLTP databases would be measured in GB.  Dave took that belief into the CloudDB effort.  On the technology front, WinFS had shown it was much easier to build on top of SQL Server than to make deep internal changes.  Object relational mapping (ORM) layers were becoming popular at the time, and Microsoft had done the Entity Framework as an ORM for SQL Server.  Another “research” project in the SQL Server team had been exploring how to charge by units of work rather than traditional licensing.  Putting this all together, the CloudDB effort didn’t go down the path of creating an environment for running existing SQL Server databases in the cloud.  It went down the path of creating a cloud-native database offering for a new generation of database applications.  Unfortunately customers weren’t ready for that, and proved resistant to some of the design decisions (e.g., Entity Framework was the only API offered initially) that Microsoft made.

That background is a little off topic, but hopefully useful.  The piece that is right on topic is Azure SQL storage.  With a philosophy that apps would use lots of modest-sized databases or shards (and understand sharding), charging by the unit of work to enable multi-tenancy as a way to reduce costs, routing built above a largely unchanged SQL Server engine, and no support for generic SQL (with its potential for cross-shard requests), Azure SQL launched with a maximum database size of 50GB.  This limit would prove a substantial pain point for customers, and a few years later was increased to 150GB.  When I asked friends why the limit was still only 150GB they responded with “Backup.  It’s a backup problem.”  And therein lies the topic that will drive the discussion in Part 2.
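The sharding idea behind that philosophy can be sketched in a few lines: a routing layer above the database engine hashes a tenant or customer key to pick which modest-sized database (shard) holds that tenant’s data. The shard names below are hypothetical, and real routing layers use consistent hashing or a lookup map rather than simple modulo so shards can be split later, but this shows the basic idea of keeping each tenant’s data within a single small database.

```python
import hashlib

# Hypothetical shard map: many modest-sized databases instead of one big one.
SHARDS = [
    "customers_shard_0",  # e.g., each capped at a modest size like 50GB
    "customers_shard_1",
    "customers_shard_2",
    "customers_shard_3",
]

def shard_for(tenant_key: str) -> str:
    """Route a tenant's data to a shard by hashing its key.

    Simple modulo hashing is enough to illustrate the routing idea;
    production systems favor consistent hashing or a lookup table so
    shards can be added or split without rehashing everything.
    """
    digest = hashlib.sha256(tenant_key.encode("utf-8")).digest()
    index = int.from_bytes(digest[:8], "big") % len(SHARDS)
    return SHARDS[index]

# All of one tenant's rows land in the same shard, so single-tenant
# queries never need to cross shards.
assert shard_for("tenant-42") == shard_for("tenant-42")
```

Because routing happens above the engine, each shard is just an ordinary database, which is exactly why a hard per-database size cap seemed acceptable to the designers.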

I’ll close out by saying that relatively low cloud storage limits are not unique to 2009, or to Amazon RDS and Azure SQL.  Google Cloud SQL Generation 1 (aka, their original MySQL offering) was limited to 500GB databases.  The second generation, released this year for MySQL and in preview for PostgreSQL, allows 10TB (depending on machine type).  Azure SQL Database has struggled to increase storage size, but now maxes out at 4TB (depending on tier).  Microsoft Azure Database for MySQL and PostgreSQL is limited to 1TB in preview, though they mention it will support more at GA.  RDS has increased its storage size in increments.  In 2013 it was increased to 3TB, and in 2015 to 6TB.  It is now 16TB or 64TB depending on engine. Why? Part 2 is going to be fun.

Posted in Amazon, AWS, Cloud, Computer and Internet, Database, Microsoft, SQL Server

Amazon Seattle Hiring Slowing?

An article in today’s Seattle Times discusses how Amazon’s open positions in Seattle are down by half from last summer and at a recent low. I don’t know what is going on, but I will speculate on what could be one of the major factors. Let me start by covering a similar situation at Microsoft about 20 years ago. Microsoft had been in its hyper growth phase, and teams would go in with headcount requests that were outrageous. Paul Maritz would look at a team’s hiring history and point out that to meet their new request they’d need to hire x people per month, but they’d never hired more than x/2 per month. So he’d give them headcount that equated to x/2+<a little>, and then he’d maintain a buffer in case teams exceeded their headcount allocation. Most teams would fail to meet their headcount goals, a few would exceed them, but Microsoft (at least Paul’s part) would always end up hiring less (usually way less) than the total headcount they had budgeted for. It worked for years, until one year came along when most teams were at or near their hiring goals and a couple of teams hired way over the allocated headcount. Microsoft had over-hired in total, and some teams were ahead of where they might have been even with the following year’s budget allocation. From then on there was pressure on teams to stay within their allocated headcount, both for the year overall and for the ramp-up that was built into the quarterly spending plans.

Could something similar be happening at Amazon? Could this be as simple as Amazon telling teams “no, when we said you can hire X in 2017 we meant X”, and enforcing that by not letting them post 2018 positions until 2018 actually starts? Amazon is always looking for mechanisms to use, rather than counting on good intentions, and having recruiting refuse to open positions that exceed a team’s current fiscal year’s headcount allocation would be a very solid mechanism for enforcing hiring limits.

It will be interesting to see if job postings start to grow again when the new fiscal year starts. That would be the most direct confirmation that this is nothing more than Amazon applying better hiring discipline on teams.

Posted in Amazon, Computer and Internet | 2 Comments

Sonos One

I’ve been a Sonos fan for years, and recently talked about my first experience using Alexa to control older Sonos equipment.  I have one room, my exercise room, that isn’t covered by the Niles multi-zone system.  I was using an Echo in there, but with the launch of Sonos One decided I had an option for better sound.  I swapped out a first generation Echo for the Sonos One.

Starting with the good news, the Sonos One sounds great, as expected.  While I find the Echo good for casual music play, the Sonos One is better when you really want to fill the room.  If you wanted to go another step Sonos supports pairing two like devices for stereo (separate left and right channels) playback.  If you have a Play:1, there are reports of it being possible to pair the One and Play:1 for stereo playback.  I have a Play:1 at a second home that isn’t being used, so it might just be coming to the ranch.  Also, I wouldn’t be surprised to see a “Sonos Three” and/or “Sonos Five” next year, but that is overkill for my current use case.

As an Alexa device the Sonos One is a little weak, at least currently.  It supports most, but not all, of the functionality of the Echo.  Since the device is so music-centric that may not be a problem, but caveat emptor.  For example, last time I tried Flash Briefing (News) it wasn’t supported though Sonos said it was coming soon.  Getting the news during a morning workout is something I want.  Alexa Calling and Messaging isn’t supported, and that may not show up for a long time.  If you want my personal speculation on that one, Amazon may be reluctant to share your contacts with a third-party device.  So a design that worked on first-party devices like the Echo wouldn’t easily adapt to those using Alexa Voice Services (AVS).  Of course, in time Amazon could find a solution.  Sonos emphasizes that the Sonos One will continue to be updated in the future, going so far as to say “Both Sonos and Alexa keep adding new features, sound enhancements, services and skills to make sure both your music and voice control options keep getting better. For free. For life.” For one of the most obvious examples, Spotify wasn’t available at launch but support was added just weeks later.

My big beef with the Sonos One is that the far-field microphone array doesn’t seem to be working very well.  Now this confuses me, because when I tested it on arrival it seemed just fine (though not up to the Echo).  That is, I could give a command while music was playing and it would hear and respond appropriately.  This morning I was listening to music and the Sonos One was almost uninterruptible.  I finally tried talking so loudly that the Echo Show in my office started to respond, while the Sonos One just a few feet away continued to ignore me.  After my workout I applied the latest update from Sonos, which claimed to fix a bug introduced in the previous update.  The next time I work out I’ll see if that made the One’s microphone array more responsive, and update this posting.

Update (12/13): I’ve had more success with the Sonos One’s microphones on subsequent occasions, so either my Sonos One needed a reboot or it needed the software update I mentioned.    They still aren’t as responsive when playing music as the Echo, but work well enough that I had no real problems switching what we were listening to across an entire afternoon of cleaning our basement.

Bottom line:  If you are buying primarily for music playback then the Sonos One is a good alternative to the Echo.  But if music is more of a secondary use, then you are probably better off with one of the Echo devices.

For another take on the Sonos One, I found the Tom’s Guide review useful.


Posted in Amazon, Computer and Internet, Home Entertainment | Tagged , , | 4 Comments

Good browsing habits gone bad

We always think that the best protection against web-distributed malware is to exercise caution while browsing.  But what if you aren’t even browsing in the classic sense, and an application renders a malware infested page?  I found out this morning.

I grabbed my first cup of coffee this morning and launched the Windows 10 MSN News app.  I’d been reading stories for about 30 minutes when a story in my “Microsoft” search tab caught my eye:  “Microsoft Issues Black Friday Malware Warning”.  It showed as being from the International Business Times, not one of the obscure sites that MSN News sometimes picks up.  I clicked on the tile and started reading.  Suddenly my Surface Book 2 started talking.  The coffee wasn’t yet working so I couldn’t quite make out what was being said, but I thought “%^*%” auto-play video, so I clicked the back arrow to get rid of the page.  The woman with the English accent didn’t stop talking.  I killed MSN News, still she droned on.  I clicked on Edge and there it was, the MSN News article had somehow launched a browser tab with some kind of phishing/ransomware/malware site.

What the woman was saying was something about my computer was found to have “pornographic malware” and that I had to contact them.  I saw that the web page had a phone number on it, and darn but I was too busy trying to kill this to write it down.  On top was a modal dialog box.  You’ll notice there is no checkbox for “prevent web page from launching dialog boxes”, or whatever Edge says.  I killed the dialog box and saw that underneath was another dialog box with that checkbox.  But before I could check it the above dialog box was back.  At one point I did check it in time, only to have the web page try to go to full screen mode.  Fortunately Edge let me block that.  So this second dialog was apparently a fake as well.

Unable to do anything to kill this from within Edge I launched Task Manager.  I really wanted to keep my other tabs so I tried killing just the process for the malicious one.  It didn’t work, it just kept re-launching.  I killed the top level process, re-launched Edge, and killed the malicious tab without opening it.  Nope, that wasn’t enough.  The malicious page came back to life.  I went through the whole thing again and this time clicked on the tab to start fresh.  Then I went into settings and cleared everything.  This finally seemed to stop it.

Next came a scan, then an offline scan, with Defender.  I followed that up with a Malwarebytes scan.  Nothing.  It looks like Edge managed to keep this beast from breaking through and making system changes, but I’m not confident about that yet.  I’m going to take a deeper look before declaring victory.

Maybe the worst part of this is I have no way to report it to Microsoft, or anyone else.  I couldn’t copy the offending URL from the address bar because of the modal dialog.  And I discovered that when you go into Edge’s browser history you can either re-launch the page or delete the history item, but you can’t Copy the link.  I spent some time looking around to see if Edge stored history in human readable format, but eventually gave up.  I don’t see a way to report the bad story in MSN News, but now I’ll go try to find it elsewhere.

Bottom line: Don’t think that good browsing habits will save you.  I’ve been using the MSN News app since it was first released with Windows 8, with this being the first malicious story I’ve found.  And it was an infected web page on a mainstream site.

Update (11AM Eastern): I scanned the IBT web page for this story using several tools, such as VirusTotal, and came up blank on any malware.  So I viewed the story directly.  Nothing bad happened.  So while the problem occurred while I was viewing the IBT story in MSN News, it isn’t clear what really caused the malicious page to launch.  Also went and checked the family member’s WiFi router I’m on and discovered it wasn’t up to my standards for security settings.  I hardened that up.

Posted in Computer and Internet, Microsoft, Security, Windows | 8 Comments

Product Launches and Vendor Conferences

My favorite tech conference, AWS’ re:Invent is coming up next week.  I attended the last three as an Amazon VP, and was hoping to attend this year as a customer, but unfortunately can’t make it.  I’ll be streaming the keynotes, but really would have loved to experience the dynamics from the other side.  I’ve had that (both employee and customer) experience with some of the Microsoft conferences. So hopefully next year with re:Invent.  If you are going to re:Invent for the first time you are in for a treat.

If we go back 40 years ago there really weren’t vendor conferences.  There were industry conferences such as the Joint Computer Conferences (JCC), and user group conferences such as DECUS and SHARE.   DECUS was technically owned by Digital Equipment Corporation (DEC), but other than DEC providing administrative support, it was run by volunteers.  I attended many DECUS conferences, first as a customer and later as an employee.  Thirty years ago COMDEX replaced the JCC as the big industry conference.   At the same time vendors were beginning events of their own.  While DECUS continued on as a user group, DEC held its first DECworld. DECworld ’87 set the standard by which today’s vendor conferences are (or should be) measured.  COMDEX peaked nearly 20 years ago, and by the early 2000s there was a clear bifurcation in approach between IT and Consumer focused conferences.  The IT audience would be addressed primarily by vendor conferences while the Consumer ecosystem would be addressed by industry conferences such as the Consumer Electronics Show and the Mobile World Congress.

Conferences have always served as major venues for product and technology unveilings.  Most people don’t realize, or don’t know, how profoundly the Internet has changed how products are launched.  Forty years ago you held a press conference, or just sent out a press release.  It would take a week or two for it to appear in Computerworld or Electronic News, which were weekly newspapers, and a couple of months later it would appear in magazines such as Datamation.  If you were an existing customer you’d probably hear about a new product from your sales rep, since all sales were direct in those days.  Industry conferences became increasingly important for product launches for one major reason, press coverage.  Only in rare circumstances could you get a large number of members of the press to travel to your press conference, but they all attended COMDEX.  Although the Internet now allows for virtual attendance to press conferences, the depth of engagement is still better in-person.  And if you are a small vendor, no reporter is going to attend your press conference.  But if they are already at an industry conference looking for interesting things to write about, well then you have a shot.

Big vendors realized a couple of things.  First, they had little trouble gaining press attention on their announcements, particularly with the advent of the Internet, without the high costs of participating in industry conferences.  Second, a show of their own allowed for much better direct engagement with their customer base.  Consider that for IT executives, developers, etc., tracking and deeply learning about new vendor products and services is a side job.  Few are going to have that in their annual goals.  Normally you are trying to get a few minutes of their time between their worrying about if the website changes are going to be ready for Black Friday, and how they are doing on their budget, to pay attention to your new products or services.  But at your own conference you get a day, or two, or three where you are their day job.  They can pay attention to news about new offerings, and gain deep knowledge in both those and existing offerings.  Now they can engage with other customers on how your products and services are being used.  Now they can engage with your partner ecosystem to find solutions for their business problems.  Now they can take the time to dig deep on areas of interest.  Now you can get an amazing amount of direct feedback from customers in a very efficient way.  Vendor conferences are primarily about deeper customer engagement, but while you have their attention it is the ideal time to introduce new offerings.  And of course, most customers attend to hear about new things.  You need both the meat (e.g., deep dive sessions) and the sizzle (e.g., keynote launches) for your conference to succeed.

I don’t think most vendors target their product cycles specifically to their conferences, but it is a somewhat natural part of their annual life-cycle.  For example, notice how Microsoft’s three major conferences bracket the end of their fiscal year.  Build is 6 weeks before the end of the fiscal year, Inspire is a week into the new fiscal year, and Ignite is late in the first quarter.  Given that planning, budgeting, and goal setting in any company tend to be tied to fiscal year boundaries, you just naturally tend to have more to say at the end of a fiscal year than in the middle.  If in the spring you have planning discussions and make product decisions, and the headcount to deliver doesn’t formally materialize until July 1, and then you have to ramp up, your new deliverables are going to be more heavily weighted in late spring or early summer of the following year.  Moreover, the conferences do act as a forcing function in that teams really want to be able to launch at one of those conferences.  So they will go the extra mile to be ready.

Amazon’s fiscal year is the same as the calendar year, so it ends December 31st.  re:Invent, which covers the purposes of all three of Microsoft’s major conferences, comes a few weeks before the end of the year.  You have the same dynamics of the planning/budgeting cycle and teams wanting to be ready by re:Invent.  You have an additional dynamic of the goal setting process that Jeff Bezos has talked about in shareholder letters.  You are coming down to the wire on meeting year-end goals, so you are making the extra push to finish things up.  The result is an explosive set of announcements ready to go.  There are so many that AWS has taken to doing many of them just before re:Invent.  This year there were 85 launches in the week and a half leading up to Thanksgiving, up from 56 in 2016.  If you saved those launches for re:Invent itself many, if not most, would be lost in the noise, despite often being among the most customer-impactful launches of the year.  Adding storage scaling to RDS SQL Server is transformational for customers using, or interested in using, RDS for their SQL Server workloads.  But how does it compete for attention with rolling out a tractor trailer (Snowmobile) during Andy Jassy’s keynote last year?  Better it was launched just before Thanksgiving than at re:Invent.

While there are likely more of the modest sized launches coming next week, what we are all waiting to see are the big launches that re:Invent is known for.  For those, and the rest of the re:Invent 2017 keynotes, you can live stream them here.


Posted in Amazon, Computer and Internet, Microsoft | Tagged | 2 Comments

Surface Book 2

I picked up my Surface Book 2 (SB2) from the UPS Store last Friday and wanted to provide my impressions.

Setup was in the Wow category.  You turn on the SB2, connect to WiFi, answer a couple of questions, log into your Microsoft account, and a few minutes later you’re looking at the same lock screen as on your other PCs.  This is really a testament to Windows 10 improvements, but when combined with the speed of the SB2 the experience is almost scary good.  Next I signed the SB2 up for Windows Insider builds, then went back to tuning up the system.  I’d expected it would take a day for the Windows Insider build to come through, but it started downloading immediately.  That too was an unusually good experience.

One of the most impressive things about the setup process, but also the most confusing, is what to do about Microsoft Office.  The SB2 comes with the Microsoft Office apps preinstalled.  It also has the Get Office app prominently displayed.  As an Office 365 Home subscriber I was left confused on what my next step should be.  Do I go to office.com and add the machine, which would trigger an Office 365 install?  Do I click on Get Office? That is really counterintuitive given Office is already there.  Then it occurred to me that the Office apps already have the ability to log in to your Microsoft Account, and I wondered what would happen if I just did that. I ran Microsoft Word and logged into my Microsoft Account and it automatically set the SB2 up as one of my Office 365 machines.  Basically there is no setup needed for Office 365, just log in.  If you know that’s all you need to do, that is.  Next up for Microsoft, find a way to make this clearer.

I’d picked up the SB2 mid-afternoon and, despite having a lot of other things to do that day, by my normal bedtime it was completely ready for use and customized like I’d been using it for months.

Although it has only been 4 days, the SB2 has been rock solid.  When I got my original Surface Book a couple of years ago there were signs of flakiness (crashes, failure to sleep, etc.) right out of the gate.  I’ll know better in a few weeks, but initial impression is that Microsoft took the reported quality issues in the Surface line to heart before releasing the SB2.

On the battery front I did no formal measurement.  What I did was charge the SB2 overnight Saturday and then see how long I could go between charges with just normal use.  It was a lighter usage period than normal for me, still it was Tuesday night, with 32% battery still remaining, when I decided to charge again.  Even with the original SB I would find colleagues with MacBooks bringing chargers to day-long meetings while I would squeeze by without one.  With the SB2 I’d never feel the need to carry a charger with me for the business day.  And could easily see myself going a couple of my normal usage days before needing a charge.

One of the original SB complaints was that the screen gave a little too much when you touched it.  The SB2 screen seems stiffer, much more comparable to regular notebooks with touch screens.  Another complaint was lapability.  Since the SB/SB2 “screen” is a full tablet with its own battery it is top heavy.  With the SB this meant that if you placed it on a surface with even a slight backward slant, such as your legs when sitting, when you lifted your hands from the keyboard it would fall over backwards.  The SB2 seems a little more balanced, but not a lot.  If I kept my legs square while sitting then it was great on my lap.  But if there was a slant towards the back you could see the front edge of the base unit start to lift up.  So you need to be a little careful when using the SB2 on your lap.  My lapability rating is “acceptable”, but if you are a heavy lap user there are better options.

Overall, I’m extremely happy with my SB2.  Maybe after a few weeks of real use I’ll find something negative to say, but right now it gets a 5/5 star rating from me.

Update (11/23): I realized I used the SB2 on my lap for a few hours last night without experiencing, or even thinking about, it tumbling off.  Just want to be clear that lapability really is acceptable.  


Posted in Computer and Internet, Microsoft, Windows | 6 Comments

Where is my Surface Book 2?

This is not a complaint, it is me anxiously waiting for a new device in a way that I haven’t for years.  You see, I have been without a PC for about 6 weeks.  Until then I had my work-owned Surface Book.  I never used it for personal things, but then I was too busy to do much anyway.  An iPad Pro, or the occasional use of a shared PC, handled my personal needs.

I’m writing this on my wife’s HP Envy all-in-1.  It’s very nice, but I wouldn’t dare sign it up for Windows Insider builds.  Down in my office I have her previous all-in-1, but it is super flakey and almost unusable.  A person-week of trying to fix it got me back to “super flakey and almost unusable”, so it is abandoned.  I need something I can make mine.  I can take Windows Insider builds.  I can set it up as a developer workstation.  I can use WSL and get a nice Ubuntu Linux environment running.  I can play with all the other neat stuff Microsoft has done in the last few years.

My most pressing need was for a new notebook.  I’ve always been a fan of the Surface Pro, so that was the default choice.  Then Microsoft introduced the Surface Laptop.  I still plan to carry a tablet with me, so didn’t need the tablet aspect of the Pro to the extent of the old days, when I carried just one device with me.  If you don’t care about having a tablet, then why not benefit from having the better keyboard?  The lapability of the Surface Laptop is much better than the Pro, so that was attractive. I started to lean towards that.  Or a Dell XPS 13.  My wife has a 3 year old XPS 13 and it is a wonderful thin and light notebook.  It is by far the notebook she has been the most satisfied with over the years.  The new ones seem even better.  Concerns about the quality of Surface devices kept the Dell in contention.

I knew another Surface-related announcement was coming, but wondered if I could really hold off.  Fortunately I did, because the Surface Book 2 solved another dilemma for me.  Should I buy a notebook and a desktop?  I really didn’t want another desktop as the Cloud has taken over many of the needs for one.  But I still wanted one high-powered PC.  The Surface Book 2 tips the balance.  Add a Surface Dock when I’m in my office and I’ll never know I don’t have a desktop.  How can I say that since I don’t have the Surface Book 2 yet?  Remember I spent a couple of years living with a Surface Book (and the AWS Cloud of course) as my only work PC.

So hopefully the Surface Book 2 will arrive tomorrow as scheduled.  Alexa will let me know the moment it hits the UPS Store, and I’ll be in my car 5 minutes later!

Posted in Computer and Internet, Microsoft, Windows | 2 Comments