
FinOps Rate Series: GCP CUD & BigQuery Member Call Session

J.R. Storment, FinOps Foundation
13th October 2020

Over 180 community members joined us this month for the second part of our FinOps Rate Series, which covered GCP CUD and BigQuery flat-rate discount optimization.

There’s a lot of complexity when you get into discounts and savings rates across all the cloud platforms. To demystify at least some of the GCP ones, we invited experts from GCP, OpenX, and Spotify to cover foundational understanding around CUDs (committed use discounts), SUDs (sustained use discounts), and BigQuery pricing.

If you want to watch the full session, join the community and check out the recording.

Rate optimization options from Google Cloud Platform

GCP’s Pathik Sharma focuses on proactively guiding enterprise customers to operate effectively and efficiently in the cloud.

He kicked the session off with a statement that should resonate with FinOps practitioners: the better visibility you have, the more inefficiencies you can pinpoint.

He explained that many rate optimization exercises are quick wins, while others are transformative, long-term initiatives. Understanding and better utilizing CUDs, SUDs, and BigQuery reservations falls squarely into the quick-win category for any GCP user.

Talking CUDs and SUDs

Pathik reviewed some basics on CUDs, SUDs, and PVMs. He recommended monitoring CUDs and analyzing their performance in the GCP self-serve console, where you can create cost breakdowns between CUDs and SUDs.
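If you prefer to slice the same data yourself, a similar breakdown can be pulled from the standard Cloud Billing export to BigQuery. The sketch below assumes that export is enabled; the project, dataset, table name, and date filter are placeholders to adapt, and the credit type values reflect the export schema as an assumption.

```python
# Sketch: break down CUD vs. SUD credits from the Cloud Billing export in BigQuery.
# Assumes the standard billing export is enabled; the project/dataset/table below
# are placeholders -- substitute your own export table and date range.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  credit.type AS credit_type,
  SUM(credit.amount) AS credit_total
FROM
  `my_project.my_dataset.gcp_billing_export_v1_XXXXXX`,
  UNNEST(credits) AS credit
WHERE
  credit.type IN ('COMMITTED_USAGE_DISCOUNT', 'SUSTAINED_USAGE_DISCOUNT')
  AND usage_start_time >= TIMESTAMP('2020-09-01')
GROUP BY credit_type
ORDER BY credit_total
"""

for row in client.query(query).result():
    # Credit amounts in the export are negative (they reduce cost).
    print(f"{row.credit_type}: {row.credit_total:.2f}")
```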

Pathik also wants users to consider Preemptible VMs (PVMs), which are similar to Spot Instances on AWS. They are very affordable (up to roughly an 80% discount), ideal for batch jobs and fault-tolerant workloads, and their average preemption rate varies between 5-15% per day per project.
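To make that trade-off concrete, here is a back-of-the-envelope sketch comparing normalized on-demand and preemptible cost per completed hour of batch work; the discount, preemption rate, and rework fraction are illustrative assumptions, not GCP list prices.

```python
# Illustrative back-of-the-envelope math for preemptible VMs (PVMs).
# All numbers below are assumptions for the sketch, not GCP list prices.
on_demand_rate = 1.00          # normalized on-demand cost per VM-hour
pvm_discount = 0.80            # "up to ~80% off" on-demand
pvm_rate = on_demand_rate * (1 - pvm_discount)

preemption_rate = 0.10         # 5-15% of instances preempted per day; assume 10%
rework_fraction = 0.5          # assume half of a preempted job's work is redone

# Effective cost per hour of *completed* work, inflated by rework from preemptions.
effective_pvm_rate = pvm_rate * (1 + preemption_rate * rework_fraction)

print(f"On-demand:          {on_demand_rate:.2f} per completed hour")
print(f"Preemptible (est.): {effective_pvm_rate:.2f} per completed hour")
# Even with retry overhead, fault-tolerant batch work stays far cheaper on PVMs.
```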

Cost optimization considerations for GCP BigQuery

GCP’s BigQuery is a serverless, highly scalable, and cost-effective multi-cloud data warehouse designed for business agility. Its pricing and performance are different from running databases on typical VMs. You can use BigQuery on-demand, but you pay a much higher rate; ideally, users want flat-rate discounts for the best cost efficiency.
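A quick way to reason about when flat-rate starts to pay off is a simple break-even estimate. The sketch below uses placeholder prices (swap in your actual on-demand rate and flat-rate commitment) to show where the two pricing models cross over.

```python
# Rough break-even sketch: BigQuery on-demand vs. flat-rate pricing.
# Prices are placeholder assumptions -- substitute your actual or negotiated rates.
on_demand_per_tb = 5.00            # assumed on-demand price per TB scanned
flat_rate_per_month = 10_000.00    # assumed monthly flat-rate commitment

break_even_tb = flat_rate_per_month / on_demand_per_tb
print(f"Flat-rate breaks even at ~{break_even_tb:,.0f} TB scanned per month")

monthly_scan_tb = 3_500            # example workload
on_demand_cost = monthly_scan_tb * on_demand_per_tb
cheaper = "flat-rate" if on_demand_cost > flat_rate_per_month else "on-demand"
print(f"At {monthly_scan_tb:,} TB/month, {cheaper} pricing is cheaper")
```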

Pathik recommends solving for how your cluster load behaves; it can be covered by a combination of different reservations and commitments. If you need help, you can use Active Assist for rate optimization recommendations: it suggests committed use discounts and helps identify BigQuery slot reservation opportunities. The tool is in its alpha stage, so be aware of that before acting on its recommendations in GCP.
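For teams that want to pull those recommendations programmatically rather than read them in the console, the Recommender API exposes them. The snippet below is a minimal sketch under that assumption: the project ID and region are placeholders, and the committed use discount recommender ID shown should be adjusted to the recommender you actually need.

```python
# Sketch: list committed use discount recommendations via the Recommender API.
# The project ID, region, and recommender ID below are placeholders to adapt.
from google.cloud import recommender_v1

client = recommender_v1.RecommenderClient()

parent = (
    "projects/my-project/locations/us-central1/recommenders/"
    "google.compute.commitment.UsageCommitmentRecommender"
)

for rec in client.list_recommendations(parent=parent):
    # Each recommendation describes a suggested commitment and its estimated impact.
    print(rec.name)
    print(rec.description)
    print(rec.primary_impact.cost_projection.cost)
```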

OpenX

Scott Snell, Cost Analyst at OpenX, walked us through their journey of optimizing GCP. Their initial challenge: migrating 15 million lines of code from six data centers worldwide in six months. The team quickly saw how differently cloud finance needed to be optimized compared to their data centers, and addressed the challenge by building strong governance, cost visibility, and cost optimization policies.

Their original cloud model was roughly 70% preemptible and 30% committed use discount across six regions (three U.S., two Asia, one EMEA), with a moderate amount of on-demand usage on top. This blend changed after migration as OpenX learned many ways to better leverage rate optimizations.

OpenX and its FinOps team prioritize cloud finance opportunities and initiatives by weighing level of effort against cost optimization benefit. In their prioritization chart, flat-rate BigQuery and CUDs deliver the highest cost optimization benefit for the lowest effort.

Diving into rate optimization, Scott shared their best practices around CUD and BigQuery flat-rate optimization: how their post-migration cloud utilization changed, forcing them to reconsider their blend of CUDs and BigQuery flat-rate commitments, and the benefits of purchasing CUDs at the billing account level rather than the project level.

Spotify: what they’ve learned at a massive FinOps scale

Scott Meyer and Brendan Greenley joined the group to talk about the FinOps insights they’ve learned at Spotify as it uses GCP at massive scale. Their perspective is refreshing because they operate at that scale and are already at the stage of automating their cost optimization.

Cost optimization is part of their “doing it responsibly” mission. Their leadership recognizes the importance of FinOps and cost optimization, tying it to a positive impact on gross margin. As a cloud-native company, it’s important for Spotify to pay attention to these growing costs.

The duo talked about how organic growth throws complexity into this process and feeds into their “CUD problem.” Using a laddering system of CUDs helps them find the minimal cost. Investigating migrating to more efficient services is also a way to reduce costs.

Part of their strategy is to combine one-year CUDs, three-year CUDs, and SUDs to find the right balance. Factoring in risk and business logic helps them adjust utilization and make the right decisions. That risk can include the introduction of newer technologies and offerings by GCP.

Seasonality of jobs helps shape CUD strategy as well. Their FinOps team tracks the daily, monthly, and annual seasonality of workloads. Because of sustained growth, Spotify is always purchasing CUDs (monthly!), prioritizes CUD utilization over SUDs, and finds that a “CUD ladder” makes the most sense.
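The laddering idea is simple to model: buy a small one-year commitment every month so that starts and expirations are staggered, and total committed capacity steps up gradually with baseline growth instead of riding on one large, risky bet. The toy model below illustrates the idea; the baseline forecast and monthly purchase size are made-up numbers, not Spotify’s.

```python
# Toy model of a rolling "CUD ladder": buy a small 1-year commitment every month
# so expirations are staggered and committed capacity can track baseline growth.
# The forecast and purchase size below are illustrative assumptions.
TERM_MONTHS = 12

forecast_baseline = [1000 + 25 * m for m in range(24)]  # assumed growing baseline (vCPUs)
monthly_purchase = 85                                    # assumed size of each monthly commitment

active = []  # list of (expiry_month, committed_vcpus)
for month, baseline in enumerate(forecast_baseline):
    # Drop commitments that have expired, then add this month's rung of the ladder.
    active = [(exp, c) for exp, c in active if exp > month]
    active.append((month + TERM_MONTHS, monthly_purchase))

    committed = sum(c for _, c in active)
    coverage = min(committed, baseline) / baseline
    print(f"month {month:2d}: baseline={baseline:4d}  committed={committed:4d}  coverage={coverage:.0%}")
```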

With custom tooling, Spotify graphs their CUD and SUD utilization with color-coded charts and watermarks to help identify how efficiently they’re utilizing savings rates. Applying this process across different services helps their FinOps teams calculate, monitor, and forecast how they utilize CUDs and SUDs.
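A stripped-down version of that kind of chart looks something like the sketch below: synthetic usage plotted against commitment “watermark” lines so under- or over-commitment is visible at a glance. The data and thresholds are purely illustrative; this is not Spotify’s actual tooling.

```python
# Sketch of a coverage chart with "watermark" lines, loosely in the spirit of the
# tooling described above (illustrative data only; not Spotify's actual tool).
import numpy as np
import matplotlib.pyplot as plt

hours = np.arange(24 * 7)
usage = 1000 + 200 * np.sin(hours / 24 * 2 * np.pi) + np.random.normal(0, 30, hours.size)

cud_watermark = 800        # assumed committed (CUD-covered) capacity
target_watermark = 1150    # assumed target: CUD plus expected SUD-discounted usage

plt.plot(hours, usage, color="steelblue", label="usage (vCPUs)")
plt.axhline(cud_watermark, color="green", linestyle="--", label="CUD watermark")
plt.axhline(target_watermark, color="orange", linestyle="--", label="CUD + SUD target")
plt.xlabel("hour of week")
plt.ylabel("vCPUs")
plt.legend()
plt.title("CUD/SUD coverage vs. usage (illustrative)")
plt.show()
```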

Watch the video for a live demo of this custom tooling and a look at how Spotify constructs their unit economics to track efficiency. Also check out their open-source cloud infrastructure management platform, backstage.io.

Catch up on the recording and join the conversation

For current members, watch the full recording here. If you aren’t a member yet, sign up to join the FinOps Foundation. Our global community meets many times a month and is always diving into cloud finance topics together.

Keep on breaking down those silos!