Cao Wensheng of Tencent WeTest/PerfDog: Development and Performance Testing Are Equally Important

Techplur

The popularity of a game is determined not solely by the number of daily active users but also by whether a game provides the best possible experience for its players. When players enjoy a game, they are sometimes curious to learn more about its developer - but rarely do they understand how the exceptional performance is achieved.

Developing a successful game takes careful planning, development, and testing for performance and quality. Outsiders may perceive this process as the testers merely "playing" the game, when in reality it involves a number of complex questions:

What is the procedure for conducting different performance tests? What methods and tools are employed?

In different game scenarios, what are the technical problems encountered during tests?

How can testing be made more efficient and professional? How do testers collaborate with developers and product managers?

How can we ensure that the game launched is high quality and that the user experience is pleasant?

To address these questions, we invited Mr. Cao Wensheng (Awen), senior testing director of Tencent IEGG, performance testing expert for numerous highly reputed games, and founder of PerfDog, to share his experience and expertise in the field of game performance testing.

PerfDog is a full mobile platform performance test and analysis tool launched by WeTest, a one-stop testing service platform for game developers powered by Tencent. PerfDog quickly locates and analyzes performance issues without requiring additional settings in mobile hardware, games, or applications. With its extreme simplicity and plug-and-play capability, this tool has been recognized and praised by many game testers and developers.


1. Game performance is largely influenced by stuttering and memory

It has become increasingly popular for the gaming community, particularly the mobile gaming community, to discuss performance in recent years. In a player's eyes, performance is just as important as how a game is played. A tactical game, for example, relies on picture quality and smooth interaction between players to succeed.

For testers, however, the technical issue of game performance is complex. The performance of a game can be influenced by various factors, including hardware, operating system, and GPU rendering, among others. With years of experience in game projects, Cao believes that the performance of a game primarily relates to stuttering and memory.


  1. Stuttering

In today's mobile world, the hardware performance of cell phones is constantly improving, and users' expectations are ever-growing. The increasing demand for high-performance games makes gamers less tolerant of stuttering. A stutter of 50 or 60 milliseconds was tolerated in the past, but today even a shorter stutter is difficult to accept.

Additionally, performance requirements in game development are on the rise. Previously, a game reaching 30 fps was considered good, but now it may need to reach 60 fps to be competitive.
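The frame-rate target translates directly into a per-frame time budget: at 30 fps each frame has roughly 33.3 ms (1000 ms / 30), while at 60 fps the budget tightens to about 16.7 ms (1000 ms / 60), which is why a hitch of 50 ms is immediately noticeable at the higher target.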

Thus, stuttering has become the main performance issue faced by game developers.

However, stuttering seems to be a random phenomenon in game performance tests: it may not appear in the first test run yet show up in the second. Therefore, to resolve this issue, it is necessary to record the live environment in which the stuttering or performance problem occurs. Performance testers should use professional, convenient, and fast tools to measure stuttering repeatedly and establish more accurate indicators of stuttering through statistical algorithms, so that the causes and occurrence of stuttering can be identified more precisely.

With such a metric system, the consumption of each frame can be evaluated more accurately. In this way, performance testers can provide developers with more targeted optimization guidance to address the stuttering problem.
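To make this concrete, here is a minimal sketch of how per-frame timestamps captured during a test run can be turned into stutter indicators. The 60 fps budget, the 2x-budget "jank" threshold, and the metric names are illustrative assumptions, not PerfDog's actual definitions.

```python
# Minimal sketch: turning per-frame timestamps (in seconds) into stutter
# indicators. Thresholds and metric names are illustrative assumptions.

def stutter_metrics(frame_timestamps, budget_ms=1000.0 / 60.0):
    # Frame times are the gaps between consecutive frame presentations.
    frame_times_ms = [(b - a) * 1000.0
                      for a, b in zip(frame_timestamps, frame_timestamps[1:])]
    if not frame_times_ms:
        return {}

    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    # Frames that blow well past the budget are the hitches a player feels.
    jank_count = sum(1 for t in frame_times_ms if t > 2 * budget_ms)
    ordered = sorted(frame_times_ms)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]

    return {
        "avg_fps": round(avg_fps, 1),
        "jank_count": jank_count,
        "p99_frame_time_ms": round(p99, 2),
        "worst_frame_time_ms": round(ordered[-1], 2),
    }

# Example: a steady 60 fps run with a single 50 ms hitch in the middle.
timestamps, t = [], 0.0
for i in range(300):
    t += 0.050 if i == 150 else 1.0 / 60.0
    timestamps.append(t)
print(stutter_metrics(timestamps))
```

Collected repeatedly across runs and devices, statistics of this kind make a seemingly random stutter reproducible enough to trace back to its cause.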


  2. Memory

Games differ from other types of programs in that they require a large number of images, textures, scene models, and shaders (programs used to render 3D graphics), all of which consume large amounts of memory.

Insufficient memory will prevent a mobile phone from running the game at all, or cause the game to crash even if it starts. Performance testers therefore need to use various memory tools, such as the game engine's built-in memory profiler, to pinpoint the objects or modules that occupy too much memory and resolve the problem (for example, by preventing certain parts from loading too many images).
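As an illustration of this kind of analysis, the sketch below groups a memory snapshot by asset category and ranks the biggest consumers. The snapshot format and the numbers are hypothetical; in practice the data would come from the engine's built-in memory profiler or a platform memory tool.

```python
from collections import defaultdict

# Hypothetical memory snapshot: (asset name, category, size in MB).
# Real data would be exported from the engine's memory profiler.
snapshot = [
    ("hero_diffuse_4k", "texture", 64.0),
    ("hero_normal_4k",  "texture", 64.0),
    ("city_block_a",    "scene_model", 48.5),
    ("ui_atlas",        "texture", 21.7),
    ("water_shader",    "shader", 2.3),
]

def top_consumers(assets, limit=3):
    # Sum memory per category, then rank categories from largest to smallest.
    by_category = defaultdict(float)
    for _name, category, size_mb in assets:
        by_category[category] += size_mb
    return sorted(by_category.items(), key=lambda kv: kv[1], reverse=True)[:limit]

for category, size_mb in top_consumers(snapshot):
    print(f"{category:<12} {size_mb:7.1f} MB")
```

Output like this points directly at the modules worth optimizing first, such as textures that can be downscaled or loaded on demand.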

Nevertheless, memory problems are not as grave as stuttering because stuttering may occur on any platform, even in flagship smartphones. Memory problems, however, tend to occur in low-end phones or devices with limited memory.


2. Using tools effectively and learning to review work

Identifying and fixing problems are the most critical steps in game testing. According to Cao, performance testers should master various testing techniques and tools in order to be proficient in this field. Meanwhile, it is vital for them to summarize the experience of previous projects and to learn from it.


  1. No universal tool exists

Cao believes that performance testers should understand that different tools have very different capabilities.

Generally, game performance testing tools can be divided into two categories: embedded SDK-integrated source-code tools and non-embedded independent tools. SDK-integrated source-code tools can collect more comprehensive information but have a high threshold for use (they place high demands on the development team and the project). Independent tools are plug-and-play and have a low threshold for use, but the performance indicators they obtain may be less extensive.

Apart from this, major engines, IDEs, and hardware manufacturers may offer their own performance analyzers, and testers should be aware of each tool's requirements. For example, some tools require a development build of the game rather than the regular release build. Similarly, tools developed by hardware vendors such as Qualcomm and ARM may not apply to other vendors' hardware, a limitation testers must keep in mind.

Therefore, testers should use various tools when testing, rather than just one specific type.


  2. Intensive study and passion are vital

According to Cao, performance testers should utilize their prior project experience to efficiently solve problems arising during the testing process.

First, it is essential to learn by doing. It is unrealistic to expect a tester to master every technique from the beginning. Performance testers should therefore keep a broad understanding of, and an open mind toward, performance, network, compatibility, and stress testing. At the same time, they can invest time and effort in studying one field in depth, in line with their interests or strengths, so that they build a firm foundation in that area.

Likewise, when you encounter an issue in a project, study it thoroughly; it will stick in your memory better and your abilities will improve more rapidly.

Specifically, Cao discussed his experience handling compatibility testing for PC games in the past.

A typical incompatibility shows up as a blurred or black screen, or other abnormal display output, on a particular computer. During this work, Cao found that certain graphics cards were prone to such display problems.

When Cao first encountered this problem, he did not know its cause, nor which types of graphics cards would trigger it. After massive testing, analysis, and summarizing of graphics cards, he discovered that the problem occurred because these games relied on certain 3D capabilities (Caps) that some graphics cards did not support.

In response, he developed Bench3D, a virtual graphics card compatibility testing tool that enables him to quickly determine which graphics cards may be problematic and which should be fine. Through this experience, Cao gained a deeper understanding of the underlying technology of compatibility, which he uses to improve the efficiency and credibility of performance tests.
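The core idea behind such a capability check can be sketched as a simple set comparison between the 3D capabilities a game relies on and those a graphics card reports. The capability names and card data below are hypothetical, and Bench3D's real checks are considerably more detailed.

```python
# Hypothetical capability data; real caps would be queried from the graphics API.
GAME_REQUIRED_CAPS = {"vertex_shader_3_0", "pixel_shader_3_0", "float_render_target"}

CARD_SUPPORTED_CAPS = {
    "CardA": {"vertex_shader_3_0", "pixel_shader_3_0", "float_render_target"},
    "CardB": {"vertex_shader_3_0", "pixel_shader_2_0"},  # an older card
}

def check_compatibility(required, cards):
    # A card is flagged when it lacks any capability the game depends on.
    report = {}
    for card, supported in cards.items():
        missing = sorted(required - supported)
        report[card] = "OK" if not missing else "missing: " + ", ".join(missing)
    return report

for card, verdict in check_compatibility(GAME_REQUIRED_CAPS, CARD_SUPPORTED_CAPS).items():
    print(f"{card}: {verdict}")
```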

In addition to the above qualities, Cao mentioned that excellent performance testers should be passionate about games.

Testing games differs from testing ordinary apps: an app's test cases can usually be run through quickly, whereas a game may be played in millions of ways, producing millions of operation paths, each with its own set of problems. In this situation, the tester must be familiar with the game, interested in it, and passionate about it.

Performance testing also places greater demands on testing techniques than general application testing does, such as dealing with poor networks, game engines, and security conditions. Those who do not love games enough will struggle to cope with these challenges.


3. Teamwork is essential


  1. Identifying a valid demand

In general, the output of the performance testing team is intended to be used by the company's internal R&D staff, so delivering new work results to all R&D teams within the company should be standard procedure. However, many performance test and development teams run into a problem: each team develops a tool that suits its own project team only, and that tool may be ineffective when used by other project teams and individuals.

According to Cao, the root cause of this problem is the testing team's tendency to treat the tool it has developed as a small utility rather than as a larger tool or even a product. A small utility has a limited range of functionality and only meets the team's own needs at the time.

Only when the tool is turned into a larger one, or even upgraded into a product, does the team realize there are thousands of needs to be met. Thinking calmly in that situation is essential, as one tool cannot meet thousands of needs. It then becomes necessary to distinguish which demands are valid, classify them, and translate them into general requirements.

When determining which demands are valid, testers must first identify how many project teams and users share a demand; then they must consider how feasible it is to execute and realize.


  2. Testers must provide honest feedback

In addition to technical aspects of testing, testers may also need to communicate with QA, project managers, product managers, developers, and others.

According to Cao, testers have the clearest insight into the project's progress and quality, so they can also provide the best feedback on it. When communicating with these colleagues, testers may find themselves on the opposite side from the developers: because testers often have to find bugs in the design of new features, there can be tension between the two sides, which makes good communication all the more important.

Additionally, testers should provide feedback to the project leader as well as to the developers. The project leader needs to be aware of the progress of the whole project; therefore, the testers must be able to speak honestly about the actual progress of the project development and the situation concerning quality.

Occasionally, testers have to assume a more significant role within the entire game development team. In Tencent, quality determines whether a game is allowed to go online, and test engineers hold this gate-keeping authority. A game cannot be launched if the testers deem it to be of inadequate quality.


4. Game testing is still far from being fully automated

Even though some traditional app testing has been automated, experiments with games have been less successful, according to Cao.

As of now, automated testing works only when the game scenario is relatively simple or when only a single skill is being cast.

In Cao's estimation, automated testing covers only about 15% of game scenarios at present. There are two main reasons, according to him, for the relatively low use of automation in game testing:

First of all, real-time rendering and translucent particle effects make image-based automation challenging. Meanwhile, game events and system event responses are implemented independently of each other, which also makes automation difficult.

In addition, game versions iterate frequently, with significant updates sometimes arriving daily, so interfaces change rapidly and automation is hard to keep up to date.

AI may work in ordinary app testing because the interfaces of common apps change little and remain consistent regardless of how the app is operated. In game testing, however, it is difficult for AI to match the scenario. When only one or a few pixels of a frame are displayed incorrectly, the AI may have difficulty finding the defect.

Moreover, performance itself also affects the application of AI. When the viewpoint changes within the game, the player sees something completely different, and AI may not detect such subtle differences.
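To illustrate the pixel-level point above, the sketch below shows why a handful of incorrectly rendered pixels can slip past a typical image-similarity check: with only a few pixels wrong, the frame still scores above a common 95% match threshold. The frames here are tiny hypothetical grayscale grids, not real captures.

```python
def pixel_match_ratio(expected, actual, tolerance=2):
    # Fraction of pixels whose values differ by no more than the tolerance.
    total = len(expected) * len(expected[0])
    matches = sum(1
                  for row_e, row_a in zip(expected, actual)
                  for e, a in zip(row_e, row_a)
                  if abs(e - a) <= tolerance)
    return matches / total

# 10x10 reference frame, all mid-gray.
expected = [[128] * 10 for _ in range(10)]

# Rendered frame with three badly wrong pixels -- a visible artifact to a player.
actual = [row[:] for row in expected]
actual[4][4], actual[4][5], actual[5][4] = 255, 255, 0

ratio = pixel_match_ratio(expected, actual)
print(f"match ratio: {ratio:.2%}")               # 97.00%
print("passes a 95% threshold:", ratio >= 0.95)  # True, yet the defect is visible
```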

Therefore, testers should understand what can be automated and what must still be performed manually, not putting all their hopes in automation.

In the game industry, performance testing ensures players can concentrate on the gameplay and operation rather than worrying about occasional performance issues.

Although the work of game testers may not be widely known, the inner satisfaction gained from work will greatly reward them.


Guest Introduction

Mr. Cao Wensheng (Awen) is the senior testing director of Tencent IEGG, chairman of Testing Summit China (MTSC), and founder of PerfDog. His experience includes the development of game engines and numerous highly reputed games, as well as various performance testing tools, platforms, and testing-efficiency technologies. Currently, he is responsible for building quality and performance tools and platforms for Tencent games, focusing on full-platform performance testing.

Editor: Pang Guiyu    Source: 51CTO