
The Challenges of Developing Games & Other High-Resolution Graphics

Developing technical and process improvement strategies

Technical Solutions
Standard industry best-practices help game developers prevent and detect many of the errors that C/C++ developers commonly encounter. Practices such as coding standard enforcement, unit testing, runtime error detection, and regression testing help developers - ranging from inexperienced to expert - produce better code. With these practices, many errors are prevented outright, and the errors that do slip in are rooted out as early as possible - when they are fastest, easiest, and cheapest to fix.

Considering the tight timelines that are characteristic of the gaming industry, it is critical that attempts to improve quality do not impact the gaming developers' already busy schedule or disrupt the creativity that is critical to keeping ahead of competitors. That's why it's critical to automate these practices. For minimal disruption, the practices are configured to run behind the scenes - for instance, overnight, as part of an automated build process - then alert developers only if a problem is found. Much testing can be done without user intervention, by scanning the code and executing it with automatically-generated test cases. Of course, the more effort developers put into the testing, the greater benefit they get out of it. For instance, to improve code coverage, developers could analyze the coverage achieved by automatically-generated unit tests, then extend these tests to cover a larger percentage of the code and/or to verify specific functionality requirements.

Automating the implementation of these key industry practices helps the gaming developers overcome many of the challenges associated with producing top-quality software in a high-pressure environment and working on teams that have many inexperienced developers. However, it does not help them with one of their toughest tasks: correctly interacting with graphics libraries.

As I mentioned earlier, there are two main graphics libraries used by game developers and other developers creating high-resolution graphics: OpenGL and DirectX. OpenGL, introduced by SGI in 1992, is the most widely-used and supported interface for realistic 2-D and 3-D high-resolution graphics. It is an open, vendor-neutral graphics standard that works on a wide variety of platforms, including Linux. DirectX is a group of technologies for running and displaying multimedia applications on Microsoft Windows and Xbox. We found that the best way to prevent errors related to library misuse was to customize the coding standards analysis practice and technologies to automatically check OpenGL and DirectX rules as well as the standard C/C++ rules. The basic set of rules, based on the guidelines set forth in the libraries' specifications, includes over 30 rules for avoiding pitfalls with OpenGL and DirectX API development. DirectX rules checked include:

  • Use the macros SUCCEEDED() and FAILED() to check whether a DirectX function has succeeded or failed
  • Include stdafx.h at the beginning of the file
  • Use D3DVALUE instead of float
  • Effects should be closed (End()) in the reverse order in which they were opened (Begin())
  • Use Pass() between Begin() and End()
  • Don't call other functions in the same line as a Begin call
  • Don't call other functions in the same line as an End call
  • Don't change Effect::Technique between Begin() and End(); call End() first
  • Don't use #include more than 10 times
  • Use an equal number of lock calls and unlock calls
  • Don't use exclusive mode
  • Don't call BeginScene() twice without first calling EndScene()
OpenGL rules checked include:
  • Use the appropriate number and order of "Begin" and "End" function calls
  • Use the appropriate number and order of "NewList" and "EndList" function calls
  • Use GL commands between a Begin/End pair
  • Use GL commands between a NewList/EndList pair
  • Don't use designated functions in Begin/End blocks
  • Don't use designated functions in NewList/EndList blocks
  • Don't use designated functions outside Begin/End blocks
  • Don't use an End block without a Begin in each NewList/EndList block
  • Don't use forbidden bracket commands between a Begin/End pair
  • Don't use forbidden bracket commands between a NewList/EndList pair
  • Use only GL functions between every Begin/End block
  • After every Begin(GL_LINES), ensure the number of "vertex" function calls is divisible by 2
  • After every Begin(GL_TRIANGLES), ensure the number of "vertex" function calls is divisible by 3
  • After every Begin(GL_QUADS), ensure the number of "vertex" function calls is divisible by 4
  • After every Begin(GL_POLYGON), use more than 4 vertices
  • Don't use the function 'LoadMatrix' to initialize a matrix
  • Don't use the function 'MultMatrix' to change a matrix
  • Don't use negative vertex and texture coordinates
  • Don't use more than one GL command in a single line
  • Don't use more than five levels of function calls
This set of rules can be extended with rules that check additional guidelines, such as the guidelines for avoiding OpenGL pitfalls described in Mark Kilgard's "Avoiding 16 Common OpenGL Pitfalls" (www.opengl.org/resources/features/KilgardTechniques/oglpitfall/), as well as guidelines in the growing set of resources for these libraries. Most gaming developers are aware of these guidelines. However, due to the extreme working conditions and time constraints mentioned earlier, they sometimes make mistakes, and they simply don't have the time to verify that the guidelines are followed.

In addition, we provided the game development organization with a technology that allowed them to design and check custom rules that verify compliance with additional guidelines. These custom rules could be used to check additional OpenGL or DirectX guidelines that the team decides are helpful, check guidelines for custom graphics libraries, and check additional requirements unique to their organization, technologies, projects, etc.

Process Improvement Solutions
Truly implementing a practice in a team requires not just the appropriate tools, but also the team culture, workflow, and supporting infrastructure required to embed the practice into the team's development process. Teams that attempt to implement practices with tools alone typically do not achieve the expected quality improvement benefits. For instance, assume that a team tries to implement the coding standards enforcement practice by only purchasing a coding standards enforcement tool and asking each developer to use it. Over time, it's likely that most of the coding standard violations will remain in the code. Why? Without additional team-wide support for the coding standards enforcement practice, developers typically become overwhelmed by the number of problems reported and do not know how to approach them. The tool helps the team members recognize the faults in their code, but if the developers do not have the necessary support, the faults remain and code quality does not significantly improve.

The Parasoft AEP Methodology details one strategy for embedding best-practices into a team's development process. In a nutshell, the AEP Methodology is a new methodology for improving software quality and increasing the efficiency of a team's software development lifecycle. It is based on the AEP Concept, which is essentially to learn from your own mistakes and the mistakes of others, and then automatically apply that knowledge in the software lifecycle to make software work. The basic principles of the AEP methodology are:

  1. Apply industry best-practices to prevent common errors and establish a foundation for full-lifecycle error prevention.
  2. Modify practices as needed to prevent unique errors.
  3. Ensure that each group implements AEP correctly and consistently.
    a. Introduce AEP on a group-by-group basis.
    b. Ensure that each group has an appropriate supporting infrastructure.
    c. Implement a group workflow that ensures error prevention practices are performed appropriately.
  4. Phase in each practice incrementally.
  5. Use statistics to stabilize each process, and then make it capable.
For a detailed discussion of how this methodology works, see http://www.parasoft.com.

By applying the previously mentioned best-practices within the AEP methodology, gaming development organizations gain the following benefits:

  • Higher quality: Fewer errors are introduced into the code, and introduced errors are identified and fixed early in the cycle (when fixing them is generally less difficult, time-consuming, and costly).
  • Fewer schedule slips, faster time to release: Because of the improved error prevention and error detection, less time is required for end-of-cycle debugging, and games are more likely to pass independent validation on the first attempt.
  • Easier, faster software updates: Team code reflects a standard style, not individual preferences, and meets a predetermined quality standard. When an organization wants to enhance or extend a popular game, misunderstandings and buried errors don't impede the process.
  • More proactive management: Increased visibility into code quality, test scope, project readiness, and team productivity helps management identify problems as they arise and start resolving them as soon as possible.
Learning From the Game Development Industry
Even if you're not one of the relatively few developers working on games for Linux, you can still benefit from the lessons we learned by working with gaming organizations.

There is a growing tendency to use Linux for high-resolution graphics development, which shares many of the same challenges as game development. As industries working on high-resolution graphics look to move away from SGI systems, they are finding that porting legacy code to Linux is much faster and more cost-effective than porting it to Windows. Linux is already emerging as the platform of choice for developing and running graphical work for animation projects (including major projects at DreamWorks, Pixar, and Sony Pictures), special effects, and film production. In addition, Linux is becoming a popular operating system for other high-resolution graphics applications, such as computer-aided design, medical devices, and geographical imaging systems.

Developers in these industries face many of the same challenges as developers in the gaming industry. For instance, developers working on high-resolution graphics in any of these industries are all too familiar with crazy deadlines, long hours, low tolerance for graphical or other errors, and the need to master standard graphics libraries (for Linux, OpenGL) and/or custom graphics libraries.

Consequently, the same technical solutions and process improvement solutions that help developers in the gaming industry can help developers working on other high-resolution graphics projects - or even in other industries with similar pressures and development environments. The automation of practices such as coding standards analysis, unit testing, runtime error detection, and regression testing helps ensure that common coding problems don't delay intense production schedules. To verify compliance with guidelines for standard graphics libraries, guidelines for other technologies that the application interacts with, and unique organizational or project requirements, the coding standards analysis practice can be extended to check custom rules. And, to ensure that these practices become an enduring and seamless part of the development process, they can be implemented within the AEP framework.

More Stories By Wayne Ariola

Wayne Ariola is Vice President of Strategy and Corporate Development at Parasoft, a leading provider of integrated software development management, quality lifecycle management, and dev/test environment management solutions. He leverages customer input and fosters partnerships with industry leaders to ensure that Parasoft solutions continuously evolve to support the ever-changing complexities of real-world business processes and systems. Ariola has more than 15 years of strategic consulting experience within the technology and software development industries. He holds a BA from the University of California at Santa Barbara and an MBA from Indiana University.

