Finally a Brand New WINDOWS PHONE!

Ok, I lied.  There is no Windows Phone; this is about the Apple iPhone X.  Actually, it isn't even about the iPhone X, it is about clairvoyance.  One of the signature features of the iPhone X is Windows Hello.  Oh, sorry, I mean "Face ID".  And Qi-based wireless charging.  I think I still have some Qi wireless charging plates lying around from a 4-year-old Lumia 1020 Windows Phone.  Ok, so the iPhone X is full of technology pioneered by Microsoft and its ecosystem years earlier.  The only reason to be snarky is the long tradition of Apple and its fanboys claiming Apple is innovative and Microsoft is not.  I should be excited that these technologies are now available in the phone ecosystem I use.  Very excited.  So why do I feel let down?

I think about the iPhone X and picture how it will change my life 6, 12, or 24 months from now and draw a blank.  The first 10 years of iPhone marked a profound change in how I live my life.  In how nearly all of us live our lives.  Even those without smartphones have experienced the change.  My mother experienced her first Uber ride a few months ago.  She didn’t have a smartphone, but my cousin did. And she is only marginally aware of how it contributed to the restaurants we’ve chosen, her healthcare, her travel, or dozens of other things we’ve done on her behalf using a smartphone.

The technological advancements in smartphones over the last decade are breathtaking, even when we take them completely for granted.  Take a current dilemma for the military: the potential that it will be denied access to GPS signals during a conflict.  For 30+ years militaries, particularly that of the U.S.A., have used GPS at the core of navigation.  It allows them to feed an accurate current position into Inertial Navigation Systems.  But GPS signal access can be denied, or potentially spoofed, so DARPA (the guys who brought you everything from the Internet to Stealth) has a project to provide accurate positioning information in GPS-denied environments.  One element of that program is to triangulate on existing radio signals.  Well, how does "GPS" work in today's smartphones?

To reduce battery drain, smartphones run with their satellite GPS receivers turned off.  However, they are always looking for, and connected to, a nearby cell tower with the strongest signal.  So they can see one or more cell towers, and they know where those towers are located.  When an application asks for location, the smartphone very quickly gives it a position triangulated from the cell towers it is already looking at.  That isn't as accurate as GPS, but it is good enough for many uses.  Then it looks at all the available WiFi signals; it can access a database of known WiFi hotspot locations and triangulate on those.  Finally, if the app has asked for maximum location accuracy, the phone fires up its GPS receiver and attempts a satellite-based fix (GPS signals don't penetrate buildings, or the "canyons" created by skyscrapers, well).  It fuses all this information to determine a very accurate location for your phone.

You can actually see this in action in many location-based applications.  You fire up Uber and watch as the pin moves from a location tens or hundreds of feet from where you are standing to almost exactly where you are.  Map apps may also show a blue circle around the location, indicating accuracy; the circle shrinks as a more accurate location is determined.  Many applications go one step further by having a database of known points of interest, such as hotels, restaurants, popular office buildings, etc.  So when you ask for an Uber and the cell tower triangulation says you are 50 feet from an Italian restaurant, it assumes you are at the Italian restaurant.  I can request an Uber from inside my condo.  Location services always start with a location around the corner from my building entrance, but the Uber app is smart enough to know I'm likely requesting a car from the Home address I've saved.  As a result, if the GPS satellites were all to suddenly stop working, most location-based apps would continue to work much as they do today.
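
To make that layering concrete, here is a toy sketch of fusing progressively better fixes, weighting each estimate by its claimed accuracy.  The numbers, names, and the simple inverse-variance weighting are my own invention for illustration, not any platform's actual location API:

```python
# Toy illustration of tiered location fixes; all names and numbers are invented.
from dataclasses import dataclass

@dataclass
class Fix:
    lat: float
    lon: float
    accuracy_m: float  # radius of the "blue circle"

def fuse(fixes):
    """Inverse-variance weighted average: accurate fixes dominate the result."""
    weights = [1 / f.accuracy_m ** 2 for f in fixes]
    total = sum(weights)
    lat = sum(w * f.lat for w, f in zip(weights, fixes)) / total
    lon = sum(w * f.lon for w, f in zip(weights, fixes)) / total
    return lat, lon, min(f.accuracy_m for f in fixes)

# Fixes arrive cheapest-first, each tier tighter than the last.
cell = Fix(47.6095, -122.3335, accuracy_m=800)  # tower triangulation
wifi = Fix(47.6101, -122.3320, accuracy_m=40)   # hotspot database lookup
gps  = Fix(47.6103, -122.3318, accuracy_m=5)    # satellite fix, if available

for tiers in ([cell], [cell, wifi], [cell, wifi, gps]):
    lat, lon, acc = fuse(tiers)
    print(f"({lat:.5f}, {lon:.5f}) within {acc:.0f} m")  # watch the circle shrink
```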

The maturity of location services in smartphones, and of course the apps built on them, has changed our lives.  Does the iPhone X offer anything with similar potential?  I don’t think so.

What about Augmented Reality?  That isn't something unique to the iPhone X, but it does have long-term potential.  I may not care much about it right now, except as a curiosity, as I'm not a gamer.  But denying AR's potential would be like claiming a decade ago that GPS was only useful for back-country hikers.  Learning, before we bought it, that our sleeper sofa couldn't make it into the guest room no matter how the delivery guys twisted and turned it would have been a game changer.
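
As a toy example of the arithmetic an AR measuring app could do for you in the showroom (a deliberate oversimplification that ignores tilting a piece around corners, which is exactly the hard part, and uses made-up dimensions):

```python
# Oversimplified fit check: can the item pass through the doorway upright,
# in some rotation?  Real furniture moves involve tilting, which this ignores.
from itertools import permutations

def fits_through(item_dims, door_w, door_h):
    # Try each face of the item against the door opening.
    return any(w <= door_w and h <= door_h
               for w, h, _depth in permutations(item_dims))

sofa = (0.90, 0.85, 2.10)  # height, depth, length in meters (invented)
print(fits_through(sofa, door_w=0.81, door_h=2.03))  # False: it won't fit
```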

So why would I part with $1000 for an iPhone X?  There are many times I'd like a screen the size of my iPhone 7s Plus in a smaller package.  I'm having trouble convincing myself that is worth $1000 when the 7s Plus is still so new.  I would normally wait for the Xs before making a move.  And there is a potential reason not to buy an X: the lack of a fingerprint reader.

The only reason I really care about a fingerprint reader is that Windows Hello, I mean Face ID, seems really inconvenient for Apple Pay.  What's the sequence?  I do the usual dance trying to find the right spot on the payment terminal for Apple Pay to launch, then I have to hold the phone up to see my face, then I have to do the dance again to find the right position to register the payment?  I've gotten good at the current sequence, where I find the right position while my thumb hovers over the home button, then just give it a touch when Apple Pay launches.  Forcing me to use Face ID (or a PIN) instead probably means it's more convenient to go back to pulling out a credit card.  I could wait and see if the Xs introduces a screen-based fingerprint reader, which Apple reportedly dropped from the X because they were having trouble getting it to work in time for launch.

So I’m not too excited by the iPhone X, and emotionally lean towards waiting for an Xs.  But that isn’t the end of the story, since my interest may grow over time.  While I have often pre-ordered devices, that hasn’t been the case with iPhones. With iPhones I find my excitement builds over time and I tend to buy them a few months after introduction.  So if February comes around and I’m carrying an iPhone X, don’t be too surprised.

Meanwhile, I was talking to a friend about what would really get me excited about mobile phones again and I gave him a one-word answer: clairvoyance.  He thought I was joking.  Fortunately we both believe in Clarke's Laws, so be prepared for magic, I mean sufficiently advanced science.  Why do I have to use a fingerprint or Face ID at all when making a payment?  Or to access the phone at all?  Clairvoyance or bust.

Behavior approaching clairvoyance is already something we are familiar with.  Android and iOS will already figure out your home and work addresses based on behavior.  Waze knows whether I'm heading to work or home based on the time of day and the location where I start a journey.  That used to happen only with my pre-designated Home and Work.  But lately I've noticed that it handles going between my Colorado home and my Colorado work location, even though what is programmed in are my Seattle home and work locations.  Figuring things out from information we provide (e.g., a calendar entry with a meeting location in it) is just good programming.  Deriving facts and projecting behaviors seemingly out of thin air?  Clairvoyance.
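
Conceptually, the home/work inference is simple.  Here is a toy sketch, with invented data and nothing like production code: label the place where the phone spends its nights as home, and the place where it spends weekday working hours as work:

```python
# Toy inference of home/work from (hour_of_day, is_weekday, place) pings.
from collections import Counter

pings = (
    [(h, True, "maple_st") for h in (22, 23, 0, 1, 6)] * 20 +   # nights in one spot
    [(h, True, "4th_ave_office") for h in (10, 12, 15)] * 20 +  # weekday daytime
    [(13, False, "golf_course")] * 5                            # weekend noise
)

def top_place(rows):
    return Counter(place for _, _, place in rows).most_common(1)[0][0]

home = top_place([p for p in pings if p[0] >= 22 or p[0] <= 6])
work = top_place([p for p in pings if 9 <= p[0] <= 17 and p[1]])
print(home, work)  # maple_st 4th_ave_office
```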

Of course clairvoyance is a marriage of sensor data, historical behaviors, and cloud-based AI models.  Mobile phones play a very small part, providing a portion (large today, smaller tomorrow) of the sensory input and serving as a human-network communications interface.  So it is entirely possible to deliver clairvoyance without requiring a new mobile phone.  But, like most things, having the right design center yields a superior experience.  A mobile phone with delivery of clairvoyant experiences as the design center?  That I could get excited about.  It ain’t the iPhone X.

Posted in Cloud, Computer and Internet, Linux and Android, Microsoft, Mobile, Windows Phone | 5 Comments

CUJO Firewall: Approach with Caution

In my previous post I did an introduction to whole-home Internet security.  For the last few months I've been trying to get one of the early devices in this category, the CUJO, working.  I have now tried CUJO in two totally different networking setups and failed to get it working properly in either: a total failure in one and a partial failure in the other.

What sets CUJO apart from other devices currently hitting the market is that it is a separate security firewall, not a new router or router feature.  That suggests you can upgrade, rather than replace, your existing network.  Given that some of us have unusual networking needs to address, the CUJO approach would seem to offer the flexibility we need.  It also would seem to address the problem of how to add security to the modem/router/access point that your internet provider supplies.

I have three home network environments that I can use to try out new products.  The first is as simple as it gets, and likely represents the environment of most Americans.  It is a 2-bedroom condo with Comcast Xfinity as the Internet provider.  Since the condo is small, the Xfinity combination cable modem/router/access point is completely adequate, and thus its base networking setup is extremely vanilla.  The second environment has CenturyLink as its Internet provider.  It uses a CenturyLink-supplied Zyxel modem/router/access point, except that I turn off the access point in favor of a Netgear Orbi for WiFi.  Though not purely an Internet provider default environment, it doesn't stray too far either.  The third environment is very complex due to the lack of any landline Internet provider.  As I didn't try the CUJO there, I'll leave out the details.  All three environments contain a number of IoT devices, which leads to complexity on the LAN side.  For example, there are definitely too many ZigBee/Z-Wave/BLE bridges, because despite using standardized protocols many vendors require a bridge of their own.  But again, that is another story.

My first attempt to get the CUJO working was in the condo, which I often use as a test bed because…my wife is infrequently there so she will never know how badly I mess things up!  There, I said it.  So I get my CUJO, watch the videos, read everything I can find, and get ready to set it up.  I discovered that with Xfinity you need a real hack to get CUJO to work.  I’m not above hacks, so I go ahead and follow the instructions to get it working.

Before we get into that, let me summarize how I understand CUJO gains access to your network traffic.  You disable the DHCP server (the thing that hands out addresses to each device in your network, like 192.168.0.23) in your router and let CUJO serve up DHCP addresses instead.  Along with the DHCP address, CUJO provides the LAN-side address of the gateway that the device should talk to in order to send data out over the Internet.  CUJO tells every device to use it, rather than the router, as the gateway.  That way every network packet to and from the device goes through the CUJO.  CUJO can then sniff packets for malicious content, block access to bad URLs, and monitor for unusual communications patterns that might indicate a device has been compromised.  For those who really care, CUJO also sets itself up in a sort-of double-NAT environment.  It sets itself up as the gateway at 192.168.0.1, with DHCP handing out 192.168.0.x addresses, and changes your router to sit at 10.0.0.1.
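
For the curious, here is a rough sketch of that DHCP trick using scapy.  This is my reconstruction of the mechanism for illustration, not CUJO's actual code; the addresses match the example above, and the interface name is an assumption:

```python
# Sketch of a DHCP offer that points clients at the security box (192.168.0.1)
# as their gateway, so all their traffic flows through it.  Illustrative only;
# requires root to send, and "eth0" is a placeholder interface name.
from scapy.all import BOOTP, DHCP, Ether, IP, UDP, sendp

offer = (
    Ether(dst="ff:ff:ff:ff:ff:ff")
    / IP(src="192.168.0.1", dst="255.255.255.255")
    / UDP(sport=67, dport=68)
    / BOOTP(op=2,                       # 2 = reply (this is the server side)
            yiaddr="192.168.0.23",      # address handed to the device
            siaddr="192.168.0.1")
    / DHCP(options=[
        ("message-type", "offer"),
        ("server_id", "192.168.0.1"),
        ("router", "192.168.0.1"),      # gateway = the firewall, not the router
        ("subnet_mask", "255.255.255.0"),
        ("lease_time", 86400),
        "end",
    ])
)
sendp(offer, iface="eth0")
```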

The problem with Xfinity-supplied routers is that you can't turn off their internal DHCP server.  So instead CUJO came up with the hack I linked to above.  I tried the hack on my router and could never get CUJO to work (it would always end up with its LEDs making the frowning face).  CUJO support is easy to reach.  I talked to them, tried a few things, then let them connect into my router to try to get it configured.  We never did get it to work, and the technician suggested I do a factory reset of the router and then install CUJO again.  At that point I'd spent most of a day on the problem, and decided I couldn't face all the things that could go wrong with a factory reset.  So I managed to undo the CUJO hack and get my home network working properly again.

The CUJO sat in my cabinet for months, until I realized it was there and that Xfinity had since sent me a newer generation modem/router/AP.  By definition, it had been "reset"!  Of course they still didn't make it possible to disable the DHCP server, so CUJO's hack was still necessary.  Having a couple of hours before I needed to leave for the airport, I reset and again installed the CUJO.  The results were no better: once set up according to CUJO's instructions, my home network became completely inaccessible.  And without a functioning DHCP server, even after removing the CUJO I struggled to regain access to the Xfinity router and return it to its proper configuration.  I had to delay my departure for the airport, and nearly missed the flight, to get my home network working properly before leaving.  I could have tried calling CUJO support again, but I'd already spent way too much time on this device and was running up against that deadline.

Instead I’ve concluded that if you have Comcast Xfinity, don’t go anywhere near the CUJO.  I’m not saying you can never get it to work, just that it is not worth the likely aggravation of trying.  Xfinity subverts the basic mechanism that CUJO uses, and the hack means you are caught in the middle.  You could also replace the Xfinity-supplied modem/router/AP with separate non-Comcast components, and add CUJO to that, but it isn’t a normal consumer thing to do.

With CUJO and Xfinity not playing together, I decided to try the CUJO on my CenturyLink network.  I knew the Zyxel modem/router/AP allowed you to turn off DHCP, so it should work the way CUJO was designed to.  I followed the instructions in the CUJO app, which automatically configured the Zyxel.  Over a few hours all the devices in my house were recognized by CUJO and were working correctly.  All except a couple of WiFi security cameras.  I waited 24 hours for their DHCP leases to expire, to be sure they reached out for a new address, but that didn't help.  They just wouldn't join the network.

I looked at the instructions for my cameras and they mentioned a number of cases where they could lose WiFi connectivity (e.g., switching network gear).  I took one of them and went through its process for connecting to a new network.  To make a long story short, it was unable to get an IP address from DHCP.  I did a hardware reset on the camera and tried the reconnect; no luck.  I tried unplugging and re-plugging the Zyxel, CUJO, and Orbi.  Well, first my entire network got screwed up.  Nothing could connect to WiFi for the longest time.  My wired Ethernet-connected PC picked up completely bogus information that I couldn't clear with any imaginable combination of IPCONFIG commands (/release, /renew, /flushdns, and friends) or the network troubleshooter.  It appeared that it had gotten some information from the router's DHCP and some from the CUJO's DHCP, which would make sense later.  It took a reboot to regain a useful Internet configuration.  I then tried to connect the WiFi camera again, with the same result.  It couldn't get a valid IP address.

At this point I’d wasted hours and was no closer to getting CUJO working correctly, so I decided to remove it from my network.  When I went in to turn back on the DHCP server in the Zyxel I was surprised to find it was already on.  I don’t know if the CUJO app had failed to turn it off, or if the Zyxel somehow turned its DHCP server back on.  I thought about just turning it off, and manually verifying the configuration was proper for the CUJO, but realized it would take days to be confident that the change would “stick” and be comfortable the CUJO was working properly.  So instead I changed the Zyxel back to sitting at the gateway address, removed the CUJO, and rebooted the Zyxel and Orbi.  My network came back with all devices working.  I had to finish the setup of the camera that I’d done a factory reset on, but this time it went smoothly.  Now I had failure with CUJO on both Xfinity and Centurylink.

So conclusion number two is that even though CUJO has tried to make setup consumer friendly, it just doesn't work reliably enough.  I'm more confident that I could have gotten the CenturyLink setup working properly, given another few hours of setup and testing, than I ever was about the Comcast Xfinity setup.  But I worry it is all too fragile.  Because of that, I just don't think I'll be giving the CUJO another shot.


Posted in Computer and Internet, Security

Whole-home Internet Security

Over the next several months I’ll be returning to blogging about one of my favorite topic areas, Internet Security and Privacy.  For this post I’ll do some background on the new generation of whole-home internet security devices.  Then I’ll do another post about the first new device I’ve used, the CUJO.

I've been seeking ways to enhance the security of the Internet for myself, my family, and my home for many years.  For example, back in 2011 I wrote about how you may need multiple anti-malware products to adequately protect yourself.  And in 2012 I wrote about the use of enhanced DNS offerings as an added layer of security for web browsing.  Please note that both postings are dated and contain suggestions I wouldn't make today.  Web of Trust went through a rough patch over privacy issues; I still use it to check out suspicious sites, but I usually don't leave it monitoring my website activity.  Both OpenDNS and Immunet were acquired by Cisco and, as a result of Cisco's business focus, have questionable futures for consumer use.  As a warning sign, an increasing number of links on the consumer OpenDNS website are broken.  In the case of OpenDNS, I'd already recommended Norton ConnectSafe instead.  Immunet was unique in its support for running concurrently with other anti-malware, but fortunately there is a better approach now.

It looks like we are entering a new Golden Age of home internet security offerings, and I hope they actually prove to be as golden in the protection they offer.  These new devices, from add-ons such as CUJO, to mesh routers with optional add-on security services such as EERO, to routers from security companies (Norton Core, Bitdefender Box 2), are finally bringing enterprise-like network edge security to the home.  Why now?  We have four trends coming together.

On the demand side, the Internet of Things (IoT) is placing large numbers of devices in our homes.  These devices can't run a full suite of security software, they may not be updateable (i.e., to fix vulnerabilities), and their market lifetimes are short (so they may not receive security patching support even if technically updateable) even though their usage lifetimes may be long.  In other words, the WiFi lightbulb I buy today may be replaced by a new and incompatible model next year, but I'll still be using it 5 years from now.  Last year we saw how these IoT devices could be compromised, and in that case hijacked to create a large DDoS attack (the Mirai botnet's attack on DNS provider Dyn).

The second trend is the cloud.  As cloud capabilities grow, the ability to use it to enhance security grows as well.  For devices to be applicable for home use they have to stay in a consumer-friendly price range, say under $300 for early adopters and under $100 at full adoption.  By moving the more resource-intensive processing to the cloud, vendors are able to offer capabilities at these price points that would otherwise cost $1000s.  Of course cost efficiency is just one way to look at it.  The cloud enables computations, on large data sets, that just aren't possible in other environments.

The third trend, also enabled by the cloud, is the maturity of machine learning.  Put (over)simply, with machine learning you feed a model a set of known malicious examples and a set of known benign examples.  It learns how to tell the difference, so when you give it a sample that is completely unknown it can tell if it looks malicious.  The more examples you feed it, the better it can distinguish between good and bad.  The training of the model is hugely expensive and is done in the cloud.  The resulting model is relatively small and fast, and can live on a modest device like a home router (or in the anti-malware suite running on a PC).  The router also reaches out to the cloud when it encounters something suspicious, but not clearly malicious.  And this isn't just about analyzing executable software; you can do the same thing with network traffic.  So if the model learns what the normal network traffic to and from company A's lightbulb looks like, then it can block suspicious traffic to or from an A lightbulb.
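
As a cartoon of that cloud-train, edge-score split, here is a sketch using synthetic traffic features and scikit-learn.  The features, numbers, and choice of model are all invented for illustration; this is not how any particular vendor does it:

```python
# Toy "cloud" training of a traffic classifier small enough to run at the edge.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Invented per-flow features: [bytes/sec, connections/min, distinct dest IPs]
benign = rng.normal([2e3, 3, 2], [500, 1, 1], size=(500, 3))
botnet = rng.normal([5e4, 60, 40], [1e4, 15, 10], size=(500, 3))  # e.g. a DDoS bot
X = np.vstack([benign, botnet])
y = np.array([0] * 500 + [1] * 500)  # 0 = benign, 1 = malicious

model = RandomForestClassifier(n_estimators=50).fit(X, y)  # the expensive cloud step

# The trained model is small; ship it to the router and score flows locally.
lightbulb_flow = [[4.8e4, 55, 35]]  # a lightbulb suddenly spraying traffic
print("block" if model.predict(lightbulb_flow)[0] else "allow")
```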

The fourth trend is simply Moore's Law.  The computational power available in a device for under $300 has grown to the point of being able to fully inspect network traffic, maintain and process expanded sets of rules, run the models output by machine learning, process automatic updates, etc.

Of course this all works for more general purpose computing devices (PCs, tablets, and phones) as well as "things".  So while I wouldn't suggest removing all anti-malware software from your general purpose devices, the added layer of protection from placing security at the edge of your home network is worthwhile even if you have no IoT devices.  If there is a Zero Day attack circulating for your PC, you have at least two chances (the network edge and the anti-malware running on your PC) to block it while waiting for the vulnerability to be patched.  Another example: your carrier may not update your Android phone quickly enough to protect against a known vulnerability, but that vulnerability could be blocked from ever reaching your phone.  At least while you are connected at home.  Therein lies the weakness of edge protection, which falls into the category of useful, but certainly not sufficient, for mobile devices.

Despite my enthusiasm for a network-edge solution for home Internet security, I see two major roadblocks ahead.  The first is that whole-home security solutions typically require an annual subscription for their cloud-based services.  What is the price point, or combination of price points, that will appeal to a broad spectrum of consumers?  How long is the included subscription?  A month, a few months, a year?  Is there a basic level of free service and then paid enhanced services?  Etc.  This is one way vendors will differentiate themselves, and we run a significant risk that these network edge devices will become like PC anti-malware software: the subscription runs out and the device is no longer updated to deal with new threats.

The second roadblock is that most people simply obtain a single-box modem/router/access point solution from their internet provider.  Until those providers start including these next generation whole-home security features, adoption will not spread far beyond early adopters and those they directly influence.  In fact, even for an early adopter the internet providers may put roadblocks in the way of incorporating the latest security devices.  You need a major hack to use CUJO with Comcast Xfinity, which I'll talk more about in my next post.


Posted in Computer and Internet, Security | 2 Comments

Maybe Consumer Reports is right about the Microsoft Surface family

I've been watching all the uproar, and denial, over Consumer Reports (CR) dropping its recommendation of Microsoft's Surface line, but I have to tell you I think CR might be on to something.  Sure, there are plenty of power users who report they've never had a problem with the Surface.  Even when they have, which I'll get to in a moment.  They, and the press, just seem to want to ignore that CR operates off broad survey responses rather than the experiences of a select few.  Of course CR's data is backward looking, and won't reflect improvements that Microsoft has made over the last 6-12 months.  And I've always believed that CR's data is biased by who they attract as subscribers.  Nevertheless, it is a valid data set.

So why do I think that CR might be right about the Surface family?  Well, let's go with my last three experiences.  The first one was when I purchased a Surface 3 (not the Pro) to use as a consumer tablet (i.e., an alternative to an iPad).  Subjectively I'd say that the Surface 3 did not even perform as well as my original Surface RT.  It was sluggish and, I thought, kind of flaky.  Then several months in, the touch screen stopped working reliably.  A few weeks later I happened to catch it in a funny light and found a hairline fracture in the screen.  I didn't remember dropping it, though I likely bumped it at some point, so this was almost certainly my fault.  I looked at the cost of repair, and it was so close to the replacement cost that I did replace it.  With an iPad Pro.  I've dropped that a few times, and it still works just fine.  Now take an average consumer.  The Surface 3 never was a satisfying experience.  It was not a physically robust device.  It was not affordable to repair.  How would that consumer respond to CR's survey?  Never mind an average consumer, how do you think I would have responded to CR's survey?

Next let’s take the Surface Book.  Like just about all owners of more recent Surface devices, I had the experience of my Surface Book failing to Sleep or coming out of Sleep all on its own.  Even power users who have tweeted they don’t agree with CR admit they had that experience with Sleep in the past.  If we talk average consumer, don’t you think that experience might lead them to be a little negative on the device and respond to CR that they’d had problems?  Now my personal experience is even worse.  I twice had the experience of putting my Surface Book in my backpack, heading to the airport, and having it come out of Sleep on its own.  In both cases it cooked itself for hours.  How hot did it get?  Well, when I reached into the backpack I burned my fingers!  After the second time the Surface Book’s screen was permanently damaged, with brownish yellow streaks along the right side and bottom of the screen.

By the way, I didn’t blindly keep using the Sleep feature (although an average consumer probably would) after the first incident.  I switched to Hibernate for months.  Then after Microsoft claimed to have fixed the problem I went back to using Sleep.  And it happened the second time.  Microsoft issued more fixes and now it hasn’t happened in a long time, but how do you think I would have answered the CR survey?

Lest one think that this is all backward looking, I had another experience just this past week.  A friend bought his daughter a Surface Studio.  A month later she said something about having lost her drawings and being unable to re-install the drawing app she was using from the store.  In fact, they couldn’t install any app.  Or run any store app.  So he asked me to take a look.  The system was completely messed up.  Attempts to fix the store failed.  Pretty much nothing store-related would work, and a lot of things in Windows 10 have a connection to the store.  I had to advise them to use Windows 10 Reset to get the system back in a usable state.  How do you think they would answer the CR survey?

I’ll contrast this with my Surface Pro 2, which has worked flawlessly from day one.  Or my original Surface RT, which was fantastically reliable as well.  From my narrow perspective, if you’d surveyed me on early members of the Surface family I’d be able to say they were very reliable.  But I haven’t had that experience with a Surface device in the last 3 years.  Does that mean I’ll avoid Surface devices in the future?  No, I’m very likely to pick up a Surface Laptop.  But I’m going to be brutal on Microsoft if that experience isn’t near perfect.

So before dismissing the CR downgrade consider that the Surface devices have had problems, and personal experience suggests those are not completely behind them.  Microsoft, rather than being dismissive of Consumer Reports’ findings, needs to double down on the quality of Surface devices.  They might also want to take another look at how they collect reliability and customer satisfaction data, because they seem to have missed that the customer experience isn’t nearly as good as their own data shows.

Posted in Computer and Internet, Microsoft | 8 Comments

Does Intentional finally have clear intent?

Reminder: This represents personal views, not those of my employer

Almost 16 years ago, while at Microsoft, I was asked to take a look at the work Charles Simonyi was doing on Intentional Programming (IP) to see if I could find a place to apply it.  After spending some time with Charles I came to the conclusion that, while interesting, I couldn't find a place to apply IP in the near term.  I wasn't the first to come to that conclusion; the investment in IP was on the chopping block because it hadn't found a productization home.  Not long after I failed to come up with a way for Microsoft to exploit IP, Charles got Bill and Steve's blessing to take IP outside and form his own company around it.

For 15 years Charles' Intentional Software pursued IP, though until today I didn't know to what end.  It seemed like little more than a pet project of Charles' that would never bear fruit.  But apparently Intentional Software eventually found a clear intent for IP: a new platform for creating team collaboration applications.  Microsoft, which has shown a strong focus on team collaboration lately (*), was so excited by where Intentional Software was going that it decided to acquire them.

Congratulations to Charles and the Intentional Software team.  Charles has been working on IP for at least 22 years, and it’s nice to see his faith rewarded.  Assuming Microsoft proves out the value of IP by turning Intentional’s collaboration platform into a successful commercial offering, it will be interesting to see where else it can apply IP.

(*) Microsoft has been obsessed with team collaboration since at least the early 90s.  So it is a little disingenuous to point out that it is a recent focus.  Exchange’s Public Folders and Microsoft Access’ multi-master replication were 1990s features intended to support collaboration and collaboration applications.

Posted in Computer and Internet, Microsoft | 2 Comments

The beauty of Amazon RDS Multi-AZ

I'm working on a blog post that tracks advances in high availability from the 1950s until today.  It will be way too long for most to read, but I'll eventually finish writing it since it amuses me.  In the meantime I came across references that triggered a desire to write about a narrower topic.  And as a reminder, this is my personal blog and these are my views, not necessarily those of my current or previous employers.

If you have some time, take a look at John Devito's tutorial on creating a Windows Cluster.  John doesn't even talk about obtaining and maintaining the requisite hardware, but it still takes until Part 4 to get Microsoft SQL Server failover clustering working.  Or take Brent Ozar's article on setting up SQL Server 2016 AlwaysOn Basic Availability Groups.  Brent also recommends you download his checklist for setting up the SQL Server you will be using as your secondary, take some precautions so it will be compatible with the primary, and apply the necessary Windows patches.  John and Brent make this easier by taking what seems like an infinite set of choices and turning them into a recipe.  But it's still not a recipe you can simply whip up for dinner.  These are but two of many write-ups that demonstrate the difficulty of creating a high-availability solution around the tools provided for Microsoft SQL Server.

Putting a high-availability solution in place for any database engine is difficult and complex.  Oracle is in a class of its own, on both capabilities and the complexity of implementation.  For open source databases there are many options, and they all come with differing trade-offs and levels of implementation complexity depending on the characteristics you are looking for.  Want to implement a highly available PostgreSQL database?  Here's a cookbook for you.  Or maybe a packaged consulting offering from EnterpriseDB would help you break through the complexity.  There are myriad solutions for MySQL (a 2010 book listed 50 recipes), and there is a more recent book by members of Oracle's MySQL team covering some of them.  MariaDB and Percona would both love to help you with consulting to set up your high-availability MySQL solution.

With all this complexity, you can imagine my pleasant surprise when a couple of years ago I discovered the Amazon RDS Multi-AZ capability.  Setting up this high-availability solution takes a single step, either at database instance creation time or later by modifying the instance: select Yes (in this case for Amazon RDS for SQL Server) from a drop-down:

[Screenshot: selecting "Yes" from the Multi-AZ (Mirroring) drop-down in the RDS console]

Of course the implementation of Multi-AZ may be complex, but all of that is hidden from the DBA and other IT staff.  The real work is done by the infrastructure and software that Amazon has created.
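
That same single step is also scriptable.  For example, with boto3 (the instance identifier and region below are placeholders), enabling Multi-AZ on an existing instance is one parameter:

```python
# Enable Multi-AZ on an existing RDS instance; identifiers are placeholders.
import boto3

rds = boto3.client("rds", region_name="us-west-2")
rds.modify_db_instance(
    DBInstanceIdentifier="my-sqlserver-db",
    MultiAZ=True,            # the entire high-availability "recipe"
    ApplyImmediately=False,  # apply during the next maintenance window
)
```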

What gets me most excited about RDS Multi-AZ is thinking back over the years to all the application databases that should have been highly available, but weren't because of the complexity and cost involved.  When I tried to reserve a tee time and the system was down.  Or placed an order on a small specialty store website and saw the telltale error message indicating the website couldn't talk to the database.  Or was rushing to change my company benefit elections before open enrollment ended, and realized the database was down and no one was around to do anything about it until Monday.  Or looked up a book in my library's on-line card catalog and realized I was going to have to search the stacks manually instead.

Sure, RDS Multi-AZ dramatically brings down the cost and complexity of keeping obviously mission-critical databases running.  But what excites me even more is that it makes it easy to make any database highly available.

Stay tuned if you want to know why the transistor was the biggest improvement ever in computer system availability, how a number of attempts to improve availability turned out to be so complex they actually reduced availability, the big breakthrough of checkpoint restart, and how ACID saved the world. It will take me a while to wrap that up, but hopefully it will be worth the wait.

Posted in Amazon, Cloud, Computer and Internet, Database, SQL Server | 4 Comments

Amazon RDS Customers: Update Your SSL Certificates by March 23, 2015

Amazon RDS customers who use SSL to connect to their databases should read this post on the AWS Security Blog.  X.509 certificate expiration may not be a topic database professionals typically worry about, but in this case you need to.

Posted in Computer and Internet