If McKinsey’s projections hold, the global insurance industry is going to look very different by 2030. By their estimates, the continued introduction of new technologies like the Internet of Things (IoT) and artificial intelligence will radically change the way most insurance providers do business, paving the way for smart, automated workflows that reduce much of the need for paperwork and manual intervention. As a result of these changes, McKinsey estimates that fully 25% of positions in the industry could be automated or consolidated by 2025, and that by 2030 the number of personnel associated with claims in particular could be reduced by more than 70%.
Today, the insurance industry is in the midst of a digital transformation. Providers vary in how far along they are and how they envision the future of the industry, but the general trend is clear: the world of pen and paper needs to give way to connected, intelligent workflows that can generate, validate, and pay out claims digitally. The effects of this shift are already being felt by end users, who are more likely than they were a few years ago to use an app when interfacing with their insurers, but they are felt just as acutely by internal staff at insurance companies. After all, those employees need solid UX in order to do their jobs quickly and efficiently.
As recently as a few years ago, the idea of a smart home, in which the appliances, sensors, and other devices around your home are networked together digitally, still seemed more like science fiction than a fact of life. And yet, today you can walk into many new homes and use your smartphone to control the temperature and the lighting, preheat the oven remotely, and get alerts on your mobile device if your smoke detector or burglar alarm goes off. It’s the type of home that technologists have dreamed of for decades.
Let’s imagine that you’re a trendy new startup. You’ve got a new widget that lots of people are downloading, one that helps them track their runs, manage their time more effectively, or connect with other members of their community. Sure, there is the usual set of information security concerns, and you have plenty of functionality to build out over time, but the occasional bug or service outage isn’t going to be the end of the world. While high-quality testing is still mission critical, it might not feel like a life-and-death situation.
When technology changes and evolves in the telecom domain, as it does almost constantly, it typically takes standards bodies like 3GPP six months to a year to establish a new set of test cases for conformance testers. Once those test cases come out, there’s a flurry of activity as operators, device manufacturers, OEMs, and others attempt to verify compliance and interoperability with new and existing standards. But 3GPP’s long lead time does little to lessen the time pressure that testers face when performing each new round of service verification.
According to a recent GSMA study, the IoT market will be worth $1.1 trillion and include about 25 billion IoT connections by 2025, with the bulk of those connections split between the industrial and vertical industry segments (13.8 billion connections) and the smart home market (11.4 billion).
Let’s say you’re a telco operator pushing out a change to your billing platform. Ideally, the team behind a project like this would have a certain level of agility: a cross-functional group empowered to solve problems flexibly within the company’s larger mission. Unfortunately, agility usually isn’t what we find in cases like these. Instead, we find “waterfall” projects in which teams are constantly waiting for approval, wading through red tape, and carrying out pre-agreed plans even as potential challenges and hurdles come to light.
In 1877, Alexander Graham Bell demonstrated the possibility of making long-distance telephone calls by calling the offices of The Boston Globe. As you can imagine, the press had a field day, and it’s easy to understand why. From the perspective of technological progress, successfully transmitting voice over a trunk line was a feat that would have been scarcely imaginable even a few decades before, and the research and experimentation that led to it were extremely complex. Service verification, on the other hand, would have been a breeze. Since the ceiling for voice quality back then was quite low, verifying service was simply a matter of placing a call and hoping that it went through. If so: service verified.
For many telco operators, testing can seem like an onerous requirement. It’s often costly and time consuming, and as telecom networks grow more complex and customer use cases and devices become increasingly fragmented, verifying service with any level of confidence is harder than ever. Because of this complexity, testers need to achieve higher test coverage than ever before in order to maintain network quality, which is why end-to-end testing has become relatively widespread in the industry. Rather than testing voice protocols and Wi-Fi connectivity on a handful of user devices in isolation, testers walk through entire systems and subsystems the way real users would.
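To make the contrast concrete, here is a minimal sketch of what an end-to-end test looks like compared with an isolated protocol check. Everything in it is hypothetical: the `NetworkStub` class and its methods are toy stand-ins for whatever harness a real test lab would drive, and the point is only the shape of the test, which walks a whole subscriber journey (attach, then call setup) and asserts on the chain as a whole rather than on one layer.

```python
# Hypothetical stand-in for a network under test. A real end-to-end
# harness would drive actual devices and infrastructure; this toy model
# only illustrates the structure of a user-journey test.

class NetworkStub:
    """Toy model of a network: tracks attached devices and active calls."""

    def __init__(self):
        self.attached_devices = set()
        self.active_calls = []

    def attach(self, device_id: str) -> bool:
        # Registration step: in a real test this would exercise
        # authentication, bearer setup, and so on.
        self.attached_devices.add(device_id)
        return device_id in self.attached_devices

    def place_call(self, caller: str, callee: str) -> bool:
        # Call setup only succeeds if both endpoints completed the
        # earlier attach step, so this assertion depends on the whole
        # preceding chain, not just the call-setup layer.
        if {caller, callee} <= self.attached_devices:
            self.active_calls.append((caller, callee))
            return True
        return False


def test_end_to_end_voice_journey():
    """Walk the system the way a subscriber would, step by step."""
    net = NetworkStub()
    assert net.attach("alice")
    assert net.attach("bob")
    # The end-to-end assertion: the full journey works.
    assert net.place_call("alice", "bob")
    # A journey with a missing earlier step fails as a whole.
    assert not net.place_call("alice", "carol")


test_end_to_end_voice_journey()
```

The design choice worth noting is that each assertion builds on the state left behind by the previous step, which is exactly what distinguishes a journey-style end-to-end test from a set of independent protocol checks.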
Network demands have grown increasingly complex in recent years. For example, people's reliance on smartphones for everything from navigation to mobile banking means that networks must be more robust and secure than ever. Meanwhile, the proliferation of IoT-enabled devices has introduced new protocols and device configurations.