Managed Security Testing

Figma design for task management tool on a yellow background
 
 

Streamlining penetration testing for white hat hackers 🧑‍💻🕷️🔨

How I boosted security-team productivity to drive business growth

Trustwave Managed Detection and Response is an industry-leading enterprise solution that uses exclusive intelligence to track, hunt, and eradicate cybersecurity threats with accuracy.

 
 

My Impact as the Sole Designer

I collaborated with multiple product teams to improve the productivity of the internal security team by 53%, allowing the company to optimize its managed service sales and cut manpower by 45%.

As the sole designer, I helped Trustwave merge five acquired products into a unified managed security platform within a B2B SaaS portal, revamping the user experiences for external customers and internal tools. I owned this project from start to finish, gathering requirements, conducting research, creating presentations to share with stakeholders, and working with the dev team throughout the implementation.

Context of MST app within the security portal, including an early IA diagram

Project Background

As part of this project, I persuaded stakeholders to expand the scope to include improving the internally managed services portion of the product, helping the company retain top talent.

While collaborating on the larger managed security product vision, I overheard managers discussing complaints from security testers who were threatening to leave the company and find work elsewhere. These users seemed unhappy with the existing product, and I was curious to know more. I started by requesting some usage data and performing some calculations. I discovered that internal employees spent more time using the existing product than customers did.
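The math itself was simple. Below is a hypothetical TypeScript sketch of the kind of calculation I ran, using made-up numbers and field names purely for illustration; the real usage data looked different.

```typescript
// Hypothetical usage-log shape; field names and numbers are illustrative only.
interface UsageRecord {
  userType: "internal" | "customer";
  hoursInApp: number; // hours spent in the portal on a given day
}

// Average daily hours for one user type.
function averageHours(records: UsageRecord[], type: UsageRecord["userType"]): number {
  const subset = records.filter((r) => r.userType === type);
  const total = subset.reduce((sum, r) => sum + r.hoursInApp, 0);
  return subset.length ? total / subset.length : 0;
}

// Made-up sample: internal testers average roughly 1.65x the customer usage.
const sample: UsageRecord[] = [
  { userType: "internal", hoursInApp: 5.0 },
  { userType: "internal", hoursInApp: 4.9 },
  { userType: "customer", hoursInApp: 3.1 },
  { userType: "customer", hoursInApp: 2.9 },
];

console.log(
  (averageHours(sample, "internal") / averageHours(sample, "customer")).toFixed(2)
); // ≈ 1.65
```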

Challenges:

  • New Team Collaboration: Coordinated with a largely new team spread across various time zones, following recent acquisitions with diverse operational methods and approaches.

  • Navigating Ambiguity: Established a unified goal and requirements while educating the team on the project's objectives and vision.

  • No Existing Research: Proceeded without the benefit of existing user research.

  • UI Constraints: Adapted the design to work within the limitations of existing UI widgets.

information graphic showing hours used per day per user type

I discovered that internal security associates used the product 1.65x more than customers...

and they were threatening to leave.

 
 

Goals

Focusing on the internal workflow portion of the project scope, we agreed it was worth optimizing the internally managed services operated by the testers while we built the new security platform.

Business Goals:

  • Optimize managed service productivity to reduce manpower and realize cost savings

  • Elevate the satisfaction of security testers to retain talent

  • Simplify the onboarding process for new testers to save time

Product Goals:

  • Optimize the portal experience for internal associates performing customer tests

  • Move from the legacy managed security platform into the new security platform we were building

 
 

Getting to Know the Security Testers

Being new to the company, I met with the internal testers to understand why they threatened to leave, uncovering multiple opportunities to improve their workflow.

I wanted to understand why the security testers were threatening to leave, so I flew to Texas, where most of the in-house security team that conducted customer testing was located. I held a group meeting followed by contextual inquiry, which gave me a grasp of the team's composition, daily objectives, working environment, and pain points.

Black and white line drawing of flight from nyc to austin, tx

After returning to NYC, I furthered my understanding by conducting screen-sharing interviews with security team members in other states as well as a few of their customers.

 

Observations:

I discovered the user experience greatly hindered their workload progress, costing the company money and upsetting customers who received late reports.

  • Testers had difficulty locating information because it was scattered across the platform, which had inadequate navigation, as well as across external apps.

  • The managed services platform lacked clear project status and submission tracking, causing testers to rely on emails and external spreadsheets for communication and status tracking.

  • Job-related content and tools were spread across 12 products.

Logos of all the products used by the security team

  • The security team had even created a custom API to gather and load information into the portal, attempting to streamline their workflow.

  • Workload queues existed in three different areas within the portal:

  1. Legacy project queue (task assignments for testers and reviewers)

  2. Pen test queue (task assignments for testers and reviewers)

  3. QA queue (task assignments for reviewers)

 

Security Tester Requests:

The security team wanted to manage their workload with fewer products and consolidate customer projects into a single view with clearly marked statuses.

  • Clearly show project status in the portal (completed, in progress, and pending tasks)

  • Swiftly locate tasks related to their test environment expertise

  • Manage all tasks from a single unified view

Defining the Scope for Our Users

I created visual artifacts summarizing what I’d learned, from personas to user flows, to share with the product teams, which helped us define what to build.

I mapped their 15 internal security roles to five persona archetypes.

Persona archetype list with each type designated in a different color

I showed the scattered portal locations and many products users interacted with to perform their job functions.

 
 

I mapped high-level individual user flows for each archetype. Then I looked for overlap and combined these into a single map for a comprehensive overview of all roles interacting with managed tests, from the customer requesting managed testing services to receiving the completed reports.

I collaborated with the PMs and lead engineers to explore how we might restructure the online portal to design something we could build within our timeline.

 
 

Design Exploration

Moving forward with team alignment, I dove into designing a consolidated work assignments view for the security team via iterative design and feedback.

My first thought was to merge all task assignments for the security team into a single view because the QA reviewers also conducted testing. This would allow viewing and tracking all tasks in one place instead of navigating between separate pages.
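As a rough sketch of the consolidation idea only (not the production code, whose data model I am simplifying here), merging the three queues into one normalized list might look something like this:

```typescript
// Hypothetical normalized task shape; the real data model had far more fields.
type QueueSource = "legacy-project" | "pen-test" | "qa";

interface UnifiedTask {
  id: string;
  project: string;
  source: QueueSource; // which of the three original queues the task came from
  assignee?: string;   // undefined means unassigned and available for pickup
  status: "pending" | "in-progress" | "completed";
}

// Merge the legacy project, pen test, and QA queues into one view,
// with pending work surfaced first.
function buildUnifiedQueue(
  legacy: UnifiedTask[],
  penTest: UnifiedTask[],
  qa: UnifiedTask[]
): UnifiedTask[] {
  const rank: Record<UnifiedTask["status"], number> = {
    pending: 0,
    "in-progress": 1,
    completed: 2,
  };
  return [...legacy, ...penTest, ...qa].sort((a, b) => rank[a.status] - rank[b.status]);
}
```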

I considered dividing the tasks into swimlanes like JIRA.

Then I got inspired by summary bar statistics I saw in other products.

I explored ideas via many iterations of wireframes:

  1. Filter Summary Buttons: Added filter summary buttons for tracking progress and phase-specific filtering.

  2. Actionable Item Emphasis: Highlighted actionable items and those needing immediate attention.

  3. In-App Communication: Introduced a comments column to facilitate team communication within the app.

  4. Task Assignment Management: Enabled direct task assignment management within the app.

  5. Contextual Detail Pane: Incorporated a detail pane for additional context, eliminating the need to navigate away from the page.

I investigated an option to filter the grid by the logged-in user’s own tasks, their team’s tasks, or all tasks across all teams.
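A minimal sketch of that filter logic, assuming a task record that carries an assignee and a team (the names here are hypothetical, not the product's actual fields):

```typescript
// Hypothetical task and user shapes for illustrating the three filter scopes.
interface Task {
  id: string;
  assignee?: string;
  team: string;
}

interface User {
  name: string;
  team: string;
}

type FilterScope = "my-tasks" | "my-team" | "all-teams";

// Return only the tasks visible under the selected scope.
function filterByScope(tasks: Task[], user: User, scope: FilterScope): Task[] {
  switch (scope) {
    case "my-tasks":
      return tasks.filter((t) => t.assignee === user.name);
    case "my-team":
      return tasks.filter((t) => t.team === user.team);
    case "all-teams":
      return tasks;
  }
}
```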

User Feedback

Feedback directed me to evolve the assignments view to emphasize high-priority tasks for individual users rather than workflow progression steps, helping them meet customer timelines.

Tester feedback paired with design solutions

Dev Updates Allowed for More UI Options

During the project timeline, our dev team updated their UI library to React, allowing for more complex and interactive UIs which I incorporated.

With new data grid options, I moved away from single-line rows and explored grouping grid row content to increase glanceability and comprehension.

I also separated the Completed tasks from Active tasks to help with server performance per feedback from the dev team and users.

This design received great feedback, along with unanimous agreement that the first grid column's visual progress tracker was overkill. Besides addressing that comment, I pushed myself to improve my design and simplify the cognitive load of understanding when testers or reviewers were needed on a task.

 
 

The Final Solution

The final workflow and page designs met stakeholders’ needs in everything from usability to performance.

Sample page from project: Active task view for the QA/Reviewer role

Key Benefits on the Active Task Page:

  • Prioritized Task Display: A consolidated view of active tasks with clear statuses, prioritized in the first grid column.

  • Eye-Catching Highlights: Important tasks are marked in an eye-catching orange color for easy identification.

  • Quick Access Summary Bar: A top summary filter bar allows for quick information access and task queue filtering, addressing users’ main concerns and priorities (a rough sketch of this logic follows the list).

  • Easy Task Assignment: User role icons make unassigned tasks easy to spot, with a one-click option to assign tasks to oneself.

  • Flexible Sorting: Tasks can be sorted by scheduled start date or project name.

  • Improved Grid Glanceability: Enhanced grid readability through strategic grouping of information, bold text, status badges, icons, color, and increased line height.

  • High-Level Overview: A detail pane provides high-level summary information, eliminating the need for users to click into the project.

  • Team Management: Options to view assigned team members and assign or unassign tasks based on user provisioning.
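To make the mechanics concrete, here is a hypothetical sketch of the summary-bar counts, one-click assignment, and sorting described above; the types, statuses, and field names are my own illustration, not the product's actual data model.

```typescript
// Hypothetical active-task shape; statuses and fields are illustrative only.
interface ActiveTask {
  id: string;
  project: string;
  status: "ready-for-test" | "in-testing" | "ready-for-review" | "in-review";
  assignee?: string;
  scheduledStart: Date;
}

// Counts per status drive the summary filter bar at the top of the page.
function summaryCounts(tasks: ActiveTask[]): Record<ActiveTask["status"], number> {
  const counts = {
    "ready-for-test": 0,
    "in-testing": 0,
    "ready-for-review": 0,
    "in-review": 0,
  };
  for (const t of tasks) counts[t.status] += 1;
  return counts;
}

// One-click "assign to me" on an unassigned task.
function assignToMe(task: ActiveTask, me: string): ActiveTask {
  return task.assignee ? task : { ...task, assignee: me };
}

// Flexible sorting: by scheduled start date or by project name.
function sortTasks(tasks: ActiveTask[], by: "start" | "project"): ActiveTask[] {
  return [...tasks].sort((a, b) =>
    by === "start"
      ? a.scheduledStart.getTime() - b.scheduledStart.getTime()
      : a.project.localeCompare(b.project)
  );
}
```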

The Final Impact

My design solution exceeded business and product goals for creating a unified managed security testing platform, increasing internal productivity and user satisfaction, saving the company money, and ensuring customer reports were delivered on time.

Wins:

  • Increased Productivity: The internal testers and reviewers were thrilled with the improved workflow, and productivity increased by 53%.

  • Reduced Headcount: The company was able to reduce headcount by 45% and save money.

  • Easier Onboarding: The security team managers were grateful to onboard new testers more easily.

  • Enhanced Performance: The product performance was greatly improved with faster load times.


Losses:

  • The company lost some of its most experienced security team members.


Takeaways:

  • This project reinforced the importance of approaching projects holistically and trusting my instincts to explore opportunities beyond the initial scope.

  • Collaboration, user research, and rapid iterative testing proved essential in delivering solutions that not only met, but exceeded both user and business expectations.

 
