
Six Requirements for Synthetic User Management | @DevOpsSummit #APM #DevOps

Violate at your own risk

You always want to know that your website is operating at its best, but how do you know that's actually the case? It's not so easy to see behind the curtain when it comes to your web infrastructure. We've long used proxy metrics like CPU load or server availability to ensure that a server is "up," but these measurements don't provide enough data. In fact, as websites become more complex and change more frequently, these measurements become less useful.

A website visit may involve a wide range of components, many of which are off-site or not easily monitored. External ad servers, web service APIs, content delivery networks, and even specialized back-end systems - each of these represents a potential bottleneck that could impact an important transaction without raising an appropriate red flag.

So what do you have in your toolbox to help address this? Load testing, real-user monitoring, and site instrumentation all help you prepare for and monitor your website visitors' experiences. But one more tool that's essential for a performance engineer is synthetic user monitoring. It's a critical part of a web monitoring strategy; for many people, however, it's uncharted territory. So, in this post we want to show you what proper synthetic user monitoring requires.

What Is Synthetic User Monitoring? How Does It Help?
Simply put, synthetic user monitoring lets you create virtual users that operate externally to your system and mimic real user behavior by running through user paths on your website or application, while you measure the performance of the system. You do this while the application is in use, on a live production system. Why? Because that's how you can see what your users are seeing, without requiring real users to execute those tasks.

Take this example: you have a check-out cart on your site - a high-value transaction, and therefore one that deserves a flawless experience. Not everyone gets to the cart. Most people are browsing the rest of the site. But when people do get there, you want to make sure they have an amazing experience.

If you measure that experience only when a user is actually checking out, you have no way of knowing in advance what it will be like. You put the high-value transaction in jeopardy because you don't have any data about how well it will perform until a real person is going through it.

This is exactly what synthetic users are for. You build a simulated transaction that mimics a user's most common tasks: add to cart, checkout, log in, etc. As load increases on the production site and more and more visitors get ready to buy, you can continually check to see what the experience is like along those key tasks without putting any individual visitors at risk. That way, you know about problems your users might encounter before they do.
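At its core, a scripted transaction like the one described above is just a sequence of named steps, each timed and checked against a threshold. Here is a minimal sketch in Python of that idea; the function name `run_synthetic_check`, the threshold value, and the placeholder steps are all illustrative, not from any particular tool.

```python
import time

def run_synthetic_check(steps, slow_threshold=2.0):
    """Run a scripted user path (a list of (name, callable) steps),
    timing each step and flagging any that exceed the threshold."""
    results = []
    for name, action in steps:
        start = time.monotonic()
        action()                      # in a real check, this would drive a browser or issue a request
        elapsed = time.monotonic() - start
        results.append((name, elapsed, elapsed > slow_threshold))
    return results

# Hypothetical user path: each step would normally exercise a real page.
path = [
    ("home",     lambda: time.sleep(0.01)),
    ("login",    lambda: time.sleep(0.01)),
    ("checkout", lambda: time.sleep(0.01)),
]
for name, elapsed, slow in run_synthetic_check(path):
    print(f"{name}: {elapsed:.3f}s {'SLOW' if slow else 'ok'}")
```

Run on a schedule against production, a loop like this is what lets you spot a slow checkout step before any real buyer hits it.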

Six Mandatory Requirements for Proper Synthetic User Monitoring
What should you be looking for in a Synthetic User Monitoring tool? Here are six attributes that should definitely be on your list of key requirements for synthetic user monitoring.

1. Support For Complex Application Scenarios and Advanced Protocols
Synthetic users are great for application transactions that really matter, and these user paths are rarely simple. Your synthetic user monitoring solution should include support for interacting with and navigating through a wide range of web technologies: Flash, HTML5, Google SPDY, Push, WebSocket, and any other of the latest web and mobile technologies.

With this support under your belt, you aren't limited in how you put synthetic users to work. Take a look at your web analytics and find out what your most common paths through the site are. Then recreate those in your synthetic user monitoring tool, exactly as your users experience them.

Beyond that, think about how you can leverage synthetic testing for new features, before you let real users in. Deploy a feature on a special build that's running on the same server. Don't direct live users to it yet, but plan some paths for synthetic users. Then at times of peak load, run the synthetic users through the script to see how the experience is.

There are plenty of other ways of leveraging synthetic user monitoring to be more proactive. By thinking about the future first, you'll use synthetic user monitoring to its maximum benefit. Check out more tips here.

2. No-Code Scripting of Test Scenarios
Once it's set up, synthetic user monitoring is a fantastic tool. What holds many people back is writing a script that lays out the entire decision and process tree a user could follow. So you want a tool that makes this as easy and frictionless as possible.

As stated above, you want to create scripts that are modeled after real user behavior. A no-coding solution for script development makes this process significantly easier because you work within a graphical interface, putting blocks of functionality together without the pitfalls and complexities of manually written scripts.

You can also incorporate other attributes of user behavior into your scripts - for example, connection speeds and browser behaviors. You can execute scenarios from various geographies for further realism in your testing.

A no-coding solution for test scripts means you can quickly churn out a robust, representative library of tests that will accurately simulate your users.
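Behind a graphical editor, a no-code tool typically stores each scenario as a declarative definition combining the user path with connection, browser, and geography attributes. The sketch below shows one plausible shape for such a definition; every field name here is illustrative, not taken from any specific product.

```python
# A hypothetical scenario definition, roughly the shape a no-code tool
# might serialize behind its graphical editor. Field names are illustrative.
scenario = {
    "name": "checkout-path",
    "steps": ["home", "search", "add_to_cart", "checkout"],
    "browser": {"engine": "chromium", "viewport": [1366, 768]},
    "network": {"profile": "3G", "latency_ms": 300},
    "geography": ["us-east", "eu-west"],
}

def validate(sc):
    """Minimal sanity check a tool might run before executing a scenario."""
    required = {"name", "steps", "network"}
    missing = required - sc.keys()
    if missing:
        raise ValueError(f"scenario missing fields: {sorted(missing)}")
    return True

print(validate(scenario))
```

Because the scenario is plain data rather than code, churning out a library of variants (different geographies, different network profiles) is a matter of editing fields, not rewriting scripts.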

3. Shared Scripts Between Synthetic User Monitoring and Load Testing
You gain a lot of efficiency by reusing your load testing cases in your synthetic user monitoring tool. If you think about it, there isn't much difference between what you want to test in a load test and what you want to test in synthetic user monitoring. In both cases, you are looking to leverage realistic test scenarios to see how the system behaves before a real user experiences a problem.

So repackage your load tests as synthetic user monitoring tests, and look for a tool that allows you to share them between these different testing environments. You'll want to be able to easily port your load tests into synthetic user monitoring tests, and you may even find that a new synthetic user monitoring scenario would make a really good structured load test. You can often use the same data too, which is a great way to test in production and test in the cloud without putting data at risk.

Your mom always told you to recycle. Here's just another way to do that!

4. Realistic Network Emulation
A good synthetic user test will simulate a real user as accurately as possible, and one key characteristic of that experience is the network. Not everyone connects to the Internet with the same high-quality connection. You'll want a synthetic user monitoring tool that emulates various network speeds (3G, 4G, Wi-Fi) as well as network errors like packet loss and latency.

When everything works smoothly, users are likely to have a good experience. But things don't always go smoothly - that's when errors occur and users complain. How does your application perform in the face of these errors? That's a key question you'll want to ask and one of the ways you can leverage a modern synthetic user monitoring tool.

Introduce errors into your test scenarios to see how your app behaves under stress. If there is a network error along the way, do client apps suddenly start drawing down lots of data as part of a re-sync protocol? What happens when this takes place at scale? The data you collect through your synthetic user monitoring tool has a tremendous amount of value and can help improve the system - and the user experience - in many ways.
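Conceptually, injecting degraded network conditions means wrapping each request with artificial latency and a chance of loss, then observing how the transaction copes. Here is a toy Python model of that idea, where a lost "packet" is a failed attempt that must be retried; the function name and parameters are illustrative, and a real tool would shape traffic at the network layer rather than in application code.

```python
import random
import time

def with_network_conditions(action, latency_s=0.1, loss_rate=0.02, retries=3):
    """Wrap a request-issuing callable with emulated latency and packet loss.
    Lost 'packets' are modeled as failed attempts that must be retried."""
    for attempt in range(1, retries + 1):
        time.sleep(latency_s)             # emulated round-trip delay
        if random.random() >= loss_rate:  # this attempt got through
            start = time.monotonic()
            result = action()
            return attempt, time.monotonic() - start, result
    raise ConnectionError("all attempts lost")

random.seed(42)
try:
    attempt, elapsed, _ = with_network_conditions(lambda: "200 OK",
                                                  latency_s=0.01, loss_rate=0.3)
    print(f"succeeded on attempt {attempt}")
except ConnectionError as exc:
    print(f"transaction failed: {exc}")
```

Sweeping `latency_s` and `loss_rate` across runs is one way to answer the question above: at what point does retry traffic itself start to amplify load on the system?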

5. Emulation of Mobile Devices
If you haven't gotten the memo yet, web users are mobile. You should no longer be thinking about these as two separate environments or even two separate user bases. Today, the rule is "mobile first." So you need to be monitoring both your mobile users and your desktop users as a common set of visitors.

Your synthetic user monitoring tool should have the ability to emulate a wide range of mobile devices so you can determine how those users may be experiencing your website, and particularly if there are any differences between what someone sees on their phone as compared to their computer.

Be sure to consider mobile load testing and monitoring right from the start, when setting up your initial synthetic monitoring and tests. Leverage your analytics data to find out how many users are on mobile and what they are doing. Don't treat this as secondary - today's web users are on their devices, maybe even more so than their computers.
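The simplest layer of mobile emulation is replaying the same user path while identifying as a mobile browser, so server-side differences (redirects, lighter templates, different ad payloads) become visible. A minimal sketch using only Python's standard library follows; the User-Agent strings and the example URL are illustrative, and full device emulation would also cover viewport, touch events, and CPU throttling.

```python
import urllib.request

# Hypothetical browser profiles: these User-Agent strings are illustrative,
# not drawn from any maintained device list.
MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 11_0 like Mac OS X) "
             "AppleWebKit/604.1.38 (KHTML, like Gecko) Mobile/15A372")
DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def build_request(url, mobile=False):
    """Build a request that identifies as a mobile or desktop browser,
    so the same user path can be replayed under both profiles."""
    req = urllib.request.Request(url)
    req.add_header("User-Agent", MOBILE_UA if mobile else DESKTOP_UA)
    return req

req = build_request("https://example.com/checkout", mobile=True)
print(req.get_header("User-agent")[:40])
```

Replaying each key transaction under both profiles, and diffing the timings, is how you catch the "works on desktop, slow on phones" class of problem before your analytics do.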

Are you still trying to figure out if a dedicated mobile testing environment is important? Check out our infographic, Mobile-First Performance - it may persuade you.

6. Real-Time Dashboards and Notifications
The synthetic monitoring system creates simulated users within a fully controllable browser, so the testing system has complete access to all the data inside the browser (unlike real user monitoring, which happens inside a sandboxed JavaScript instance). The detail that can be garnered from this is staggering, including full waterfall charts, resource-by-resource performance, and screenshots/videos of the page load in action to determine paint times.

Make sure your synthetic user monitoring solution takes advantage of all the information available and makes it accessible through a rich set of dashboards and real-time notifications. You should have access to real-time and historical data, along with the ability to set and monitor key performance indicators (KPIs). You'll also want to configure alerts so your monitoring team can take action when SLAs are violated.

This is a critical requirement, as it turns your synthetic user monitoring tool from a learning system to a doing system. Regular synthetic tests can monitor performance and immediately alert staff to fix a problem before a user experiences it.
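The "doing system" part boils down to comparing each measured KPI against its threshold after every synthetic run and raising an alert on violation. A minimal sketch of that evaluation step is below; the KPI names, thresholds, and measured values are all hypothetical.

```python
# Hypothetical KPI thresholds: p95 response times, in milliseconds.
KPIS = {"checkout_p95_ms": 2000, "login_p95_ms": 1000}

def evaluate(measurements, kpis):
    """Return the names of KPIs whose measured value violates its threshold."""
    return [name for name, limit in kpis.items()
            if measurements.get(name, 0) > limit]

violations = evaluate({"checkout_p95_ms": 2450, "login_p95_ms": 800}, KPIS)
for name in violations:
    print(f"ALERT: {name} violated SLA")
```

In practice the alert would fan out to a pager or chat channel, but the core loop is exactly this: measure, compare, notify, before a user ever sees the slowdown.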

Stay Sensible with NeoSense
You can meet these requirements - and then some - with NeoSense. With this monitoring system on your side, you'll be able to work with complex business applications and simulate the most complicated of user paths. It's fast, it's powerful, and it integrates with the newest technologies. Get more information about NeoSense here!

More Stories By Tim Hinds

Tim Hinds is the Product Marketing Manager for NeoLoad at Neotys. He has a background in Agile software development, Scrum, Kanban, Continuous Integration, Continuous Delivery, and Continuous Testing practices.

Previously, Tim was Product Marketing Manager at AccuRev, a company acquired by Micro Focus, where he worked with software configuration management, issue tracking, Agile project management, continuous integration, workflow automation, and distributed version control systems.
