April 12, 2022

Transitioning Scopely’s 5.5 PB Data Platform to the Modern Data Stack

Should data engineering AND BI be handled by the same people? According to Jonathan Palmer, VP Data Platform at Scopely – YES. By Analytics Engineers.

His team of Analytics Engineers is in the final stages of transitioning 5.5 PB of data, with 15B events ingested per day, to the modern data stack. Tune in to learn how they did it.

Listen on Apple Podcasts or Spotify

Boaz: Hello, everybody. Welcome to another episode of the Data Engineering Show. I am Boaz. I am here alone today without Eldad. But with me is Jonathan Palmer. Hi, Jonathan, how are you?

Jonathan: Hey, I am good. How are you doing?

Boaz: Very good. Jonathan is the VP of the data platform at Scopely. Scopely, if you do not know, is a mobile-first gaming company. They make games like The Walking Dead, Scrabble GO, Looney Tunes: World of Mayhem, and many others. Prior to that, Jonathan was head of BI at GoCardless, and before that he spent several years in gaming at King. His background combines both business intelligence and data engineering.

Jonathan, what did I miss about your background that is important to mention, or did I get it right?

Jonathan: I think you got it pretty much spot on.

Boaz: Awesome. So, tell us, what do you do at Scopely?

Jonathan: I work for a part of Scopely called Playgami. Playgami is the platform that powers our games, including their analytics capabilities, and enables them to do everything from game provisioning infrastructure to experimentation to CRM and push messaging campaigns, and everything in between, most of which is powered by data in one form or another. So, it is my job to manage the strategy for how that data platform scales and grows and powers all the games, the tech vendors we work with, the people we hire, and the products and ideas that we have. One way or another, it is my job to make sure that all works together.

Boaz: When you say platform in the context of what you do, is that essentially the data platform, or the entire platform on top of which the games run?

Jonathan: Yes, my bit is the data platform, but it sits within the entire platform on which the games run.

Boaz: You mentioned Playgami, and we can see it behind you in the background. So, tell us a little bit about that. Are you guys positioning this to become a public, outward-facing kind of name for the data platform? What is the story of Playgami under Scopely?

Jonathan: Playgami is an internal brand at this point. Scopely has its organic titles, the ones that were built by Scopely studios, but we also publish games from third parties, and we have also acquired a few businesses over recent years, like FoxNext with Marvel Strike Force and, just recently, GSN Games. So, Playgami in some ways is part of the offering and the value proposition of Scopely. It is a super cool platform that enables you to manage the full life cycle of your game, and data is the lifeblood of that.

Boaz: I would love to spend more time on that. Before that, walk us a little bit through your background; it has been an interesting journey. You started more on the business intelligence side, then touched many things, and ended up now as VP of the platform. Walk us through your journey.

Jonathan: I had a strange start. I studied ancient history, so I had a kind of zero tech background.

Boaz: So you studied Hadoop. 

Jonathan: No, Microsoft Access. Yeah, I got my break in tech from a company called Spark Data in Bristol, UK, which deliberately employed people from non-tech backgrounds and taught them how to be software engineers working on data-driven systems. That is where I learned SQL, and over the years I got closer and closer to the data side. The inflection point, I guess, was when I joined King, as you mentioned; I was a QlikView developer there. But I became more and more interested in the life cycle of data and how it works in enormously complex, fast-paced businesses like gaming, and that took me into roles like principal engineer, product manager, and then Director on the data analytics side. More and more, I was thinking about the strategy and how all of this tech fits together with what the business is ultimately trying to do.

Boaz: That is amazing. You do not meet many data historians.

Jonathan: No, there were not many jobs in ancient Rome to go to. So, I went down a different path.

Boaz: Yeah. It is the kind of thing you hear about, companies trying to encourage people who do not come directly from tech or an engineering education to take on these positions. But the truth of the matter is that it remains very rare. So, it is very interesting to hear, and it has worked amazingly well for you. I wish we would see that more often.

Jonathan: Yeah, it is an interesting model. It definitely gives a high level of imposter syndrome, I can tell you that.

Boaz: I am sure it does. Let us dive into the platform at Scopely, which is super interesting. Before that, how many employees does Scopely have worldwide, and how big are the data-related teams?

Jonathan: Scopely is growing super fast, and by the time I say a number it is probably already out of date, but last time I looked it was about 1700 people in 17 markets all over the world, everywhere from LA to Dublin, London, and Barcelona, where I am. The data part of that is roughly 60 to 70 people in data-related roles, covering a data infrastructure team, data applications, core BI, data science, and embedded mini data teams of product analysts and analytics engineers in games and verticals as well. So, across all of that organization, it is about 60 to 70 people.

Boaz: And Playgami, as the platform you described, is that a new initiative, or was it there from the outset?

Jonathan: Every game requires a platform to run on. So, the platform itself is, I guess, as old as Scopely, but over the last couple of years we have really refined how we think about that platform, increasingly delivering more and more competitive, differentiating features, and it is becoming a stronger and stronger brand within Scopely itself.

Boaz: So, when did the Playgami brand launch?

Jonathan: I think in the last 12 to 18 months, I would say. 

Boaz: So, what does the data stack look like? 

Jonathan: We are in the final stages of moving from a legacy stack to our current, future-proof stack, so to speak. It is principally BigQuery and dbt, ELT with Airflow orchestrating dbt jobs, and the main user touchpoint is Looker. We ingest data originally as Parquet files into S3 on AWS and then ship everything across to BigQuery on a micro-batch basis.
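To picture the pattern Jonathan describes, here is a minimal, hypothetical sketch of an Airflow DAG that ships a micro-batch of Parquet from S3 into BigQuery and then has dbt build the downstream models. The bucket, dataset, model tag, and schedule are invented for illustration; this is not Scopely's actual pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.transfers.s3_to_gcs import S3ToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="events_micro_batch",
    start_date=datetime(2022, 1, 1),
    schedule_interval=timedelta(minutes=15),  # illustrative micro-batch cadence
    catchup=False,
) as dag:
    # Copy the latest Parquet files from the S3 landing zone to GCS.
    s3_to_gcs = S3ToGCSOperator(
        task_id="s3_to_gcs",
        bucket="example-raw-events",  # hypothetical S3 bucket
        prefix="events/{{ ds }}/",
        dest_gcs="gs://example-staging/events/{{ ds }}/",
        replace=False,
    )

    # Load the staged Parquet into a raw BigQuery table.
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_to_bq",
        bucket="example-staging",
        source_objects=["events/{{ ds }}/*.parquet"],
        destination_project_dataset_table="example_project.raw.events",
        source_format="PARQUET",
        write_disposition="WRITE_APPEND",
    )

    # Let dbt build the downstream models that Looker reads.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --select tag:events --profiles-dir /opt/dbt",
    )

    s3_to_gcs >> load_to_bq >> dbt_run
```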

Boaz: What does the legacy stack look like?

Jonathan: Before we made this change, the raw events were ingested into S3, and then we used EMR, and we managed Spark and Impala ourselves on AWS infrastructure for ETL and also for querying the data directly. The principal means of access was Tableau.

Boaz: Okay. So, from Tableau and the old stack to the current new one.

Jonathan: Yes. Yeah, exactly.

Boaz: What was the tipping point? Can you describe the moment in time when the company decided it had to start looking at building out a new stack because this one does not cut it anymore?

Jonathan: So, I joined Scopely a couple of years ago, and when I joined, one of the first things I did was look around and see how everything was scaling, both in terms of the data warehouse and computational workloads, and also the user experience. The team had done a super good job running the Spark and Impala setup, but we were seeing a growing number of outages. One of my Northstars, especially as Scopely had already made the decision before I joined to move towards Looker, is average query duration in Looker. When we started off, it was around five minutes on our Spark infrastructure, which is clearly not what the user wants. Moving to BigQuery, we have now got that to 30 seconds, and in terms of the actual distribution, most of our queries are in single figures.
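One way to track a query-duration Northstar like the one Jonathan mentions, as a sketch rather than his actual tooling, is to aggregate runtimes from BigQuery's INFORMATION_SCHEMA jobs view for queries issued by Looker. The project name, region, and the assumption that Looker runs under a dedicated service account are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

query = """
SELECT
  DATE(creation_time) AS day,
  AVG(TIMESTAMP_DIFF(end_time, start_time, MILLISECOND)) / 1000 AS avg_seconds,
  APPROX_QUANTILES(TIMESTAMP_DIFF(end_time, start_time, MILLISECOND), 100)[OFFSET(50)] / 1000 AS median_seconds,
  COUNT(*) AS queries
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  -- hypothetical Looker service account
  AND user_email = 'looker@example-project.iam.gserviceaccount.com'
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY day
ORDER BY day
"""

for row in client.query(query).result():
    print(row.day, round(row.avg_seconds, 1), round(row.median_seconds, 1), row.queries)
```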

Boaz: You guys had been an AWS shop, so I am sure making the decision to go in the GCP direction was not an easy one. Tell us a little bit about that decision process and how you decided to go for it.

Jonathan: Yeah. I do not think it was too controversial. Scopely had ambitions for a while to consider a multi-cloud approach, and this was one of the more obvious opportunities to do that. I had worked with the Google stack at King and GoCardless, so I already rated it pretty highly. But, fundamentally, we went through a super rigorous market search. We looked at all of the big players, Databricks, Snowflake, and Redshift, and put them all through their paces based on our particular use case. At the end of that evaluation process, there was a clear winner across a variety of dimensions, not just speed, but scalability and cost efficiency. A big one for me is the barrier to entry. Spark is super powerful, but it is quite hard to hire people who straight away get Spark and can be effective in that world, whereas the barrier to entry for BigQuery, I think, is a lot lower.

Boaz: The intention, though, is to stay multi-cloud and keep using both?

Jonathan: Yeah. I mean, Scopely leverages AWS for all sorts of parts of the gaming side of the infrastructure and does it extremely well. So, at the moment, I think we have got a good balance between using the strengths of the various platforms in the right place. 

Boaz: Another interesting thing with the migration is the move from Tableau to Looker. Oftentimes we see people who do not have anything against moving out of Tableau, but the sheer amount of reports that already exist makes it tough to agree on moving to a new tool. How did you go about that? Is there a process of recreating reports, or do new ones go in the new platform while the old ones stay in Tableau?

Jonathan: It is funny. For me it was non-controversial because the sheer number of Tableau reports was one of the main reasons to move. I think we deprecated somewhere between 80% and 90% of all the Tableau dashboards created in Scopely's history as part of the move to Looker. There was a huge amount of debt there that was not being used. That basically creates confusion; a single source of truth becomes impossible. So, it was a very compelling reason to tear it up and start again.

Boaz: It is like moving apartments; you go drawer by drawer.

Jonathan: Exactly. It is the housekeeping.

Boaz: The trash bin is right there, and you decide what goes into the trash and what stays in the drawers.

Jonathan: Exactly, everybody talks about spring cleaning, but rarely gets round to doing it. So, it is the best opportunity to do that. 

Boaz: It takes courage too. At the end of the day, deprecating 80% or 90% takes courage to some extent for many companies. Even though it might be the completely rational and correct thing to do, I feel many companies, or maybe all of us, get emotionally attached to the many work hours behind those dashboards. Throwing them out oftentimes makes people feel, what did I do it for?

Jonathan: That is true. You could call it courage, but in our case, we had a relatively new team and a pretty clear sense that the way we were operating was not going to scale for a Scopely that is 10X what it was then. So, we had to do something, and the value of Looker in that was really clear to the team, I think.

Boaz: From a skill set perspective, how did you go about that? I mean, did you guys need to hire a lot of new people, or were the same people who were in charge of the existing stack able to take over and happily move to the new one?

Jonathan: We have been hiring rapidly anyway, so it is a mixture of both. But all the people who were working on that old stack transitioned pretty seamlessly and impressively to the new stack. I put a lot of time and effort personally into that beginning stage, the evangelization and education piece, so that when people started working with it, they could operate pretty autonomously, because having me as the only person who knows Looker in a large organization is not very scalable. So, I distributed that knowledge super fast, but also worked on getting that buy-in, people feeling like, okay, this is actually going to make our lives better, and we want to learn and adopt this. Generally, that went pretty well, and people have really embraced the opportunity. And when it comes to hiring, if you only target people who are already working with your stack and technology, your target addressable market is super small, even today. Generally, we are looking for attitudes and abilities; we are not restricting it to just the people who have worked on that stack.

Boaz: What data volumes are you guys dealing with? For example, in BigQuery, how much data are you looking at? 

Jonathan: I believe there are about five and a half petabytes. We ingest somewhere between 12 and 15 billion events a day. Then, in terms of consumption, we have got about one and a half thousand Looker users, and they are running about 40,000 queries a day on top of that data.

Boaz: Wow. That is impressive. And, so what kinds of use cases are running on top of the platform? 

Jonathan: A variety of things, the most digestible of which is the portfolio reporting: the Northstar KPIs that the whole business uses to understand retention, engagement, acquisition, and monetization. These are the gold-standard measures of those things. Then there is deep-dive analysis in the games themselves, looking at live-ops performance, level progression, characters, and battles, depending on the genre. We also have user acquisition and ads use cases, whether that is managing budget allocation for UA campaigns or powering the forwarding of events onto UA networks and things like that.

Boaz: Let me stop you right here for a second. Across these three use cases, for example, how are the people split? Are we talking about analysts embedded in the respective departments, with a joint horizontal data engineering team that takes care of everything under the hood? Walk us through the human structure.

Jonathan: Right, yes. Our centralized structure is around data infrastructure, fundamentally running the platforms that power the ingestion and how we think about the relationship between AWS and GCP. Then there is the core BI team; their job is to build those baseline abstractions, the things that power the portfolio KPIs, for example, but also the building blocks that other people can extend, Looker Explores and things like that. That is the responsibility of that team.

Boaz: So, for example, if you want to introduce a new Northstar KPI, which people are involved? Who is the analyst that defines the KPI, and where does he or she sit?

Jonathan: Fundamentally, the engineering will be done by the core BI team, and the guidance for that team comes from its product manager. They partner closely with a strategic analytics organization, which is not part of my organization; it sits on the commercial side of the business. Part of what they do is think about what success looks like in Scopely and how we measure it, so typically they are the ones driving that KPI. For example, at the moment we are working on bringing a better lens to reactivation of players, so they are the ones thinking about the definition of that, and then my core BI team will actually be the ones who build it.

Boaz: Amazing. So, in the core BI team, there are both data engineering skill sets and BI skill sets?

Jonathan: Yeah. All the technical roles are generally called analytics engineers, in line with that move towards people who can move pretty seamlessly between building dbt data models, writing LookML, and constructing Looker products. Day to day, people have different specializations, but our mission is to have people who are able to move across all three.

Boaz: Okay. So, it is not like person A stops at BigQuery and person B starts at Looker. You have the same people who can do both.

Jonathan: Yeah. We have come from that model, where it was more of a relay handover, and more and more I am trying to bring it together so there is greater diversity of skills across the stack.

Boaz: Interesting. Do you consider the transition over? Is it done? Is it still in progress? How much is left?

Jonathan: The backbone of the transition is done. From the moment we signed a deal with Google to first business value was three months. We worked backward through the migration: we moved all the user-facing access first, whether that was Tableau dashboards that still existed at that point, Looker, or direct SQL query access. All of that happened first so that we could take the pressure off teams keeping the lights on in the old stack and deliver value as early as possible relative to the meter that had started running. Then we worked back upstream through ETL. All of our ETL, whether game-specific or on the core side of things, was migrated in less than 12 months, and we have deprecated pretty much all of the AWS side of things. The one thing left, and I would say it is not so much related to the migration, is that we are now working on streaming ingestion directly into BigQuery so that the data is as fresh and as fast as possible.
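A minimal sketch of streaming rows directly into BigQuery, the direction Jonathan says the team is heading. It uses the simple insert_rows_json streaming API; the table and event fields are invented, and a production setup at this scale would more likely sit behind Pub/Sub/Dataflow or the Storage Write API.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")
table_id = "example-project.raw.events_streaming"  # hypothetical table

# Hypothetical event payloads matching the table's schema.
rows = [
    {"event_name": "level_complete", "player_id": "p_123", "ts": "2022-04-12T10:00:00Z"},
    {"event_name": "purchase", "player_id": "p_456", "ts": "2022-04-12T10:00:01Z"},
]

# Streamed rows become queryable within seconds, which is the freshness goal.
errors = client.insert_rows_json(table_id, rows)
if errors:
    raise RuntimeError(f"Streaming insert failed: {errors}")
```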

Boaz: What other new initiatives are lined up? Where do you want to see the platform a year from now?

Jonathan: We have got a couple of big focuses. At this point, I am super happy with the scalability of our platform. It does what it needs to do, and it will scale naturally as Scopely grows and we add more use cases, more games, more studios, more data. The focus now is on quality and observability. We can do great things with the data, but we rely heavily on that data being clean, and fundamentally a lot of that comes down to the tracking side of things. At the moment, it is pretty easy for a game to implement their tracking completely incorrectly, and then we throw analytics engineers at the problem to tidy it all up and get it into the shape we need. So, we are going upstream now and looking at the way we do tracking, whether it is the semantics or the tooling that we give to game teams to help with that, so they can easily get it right the first time and so that time to insight, which is another of my Northstar metrics, is as quick as possible.

Boaz: But how are you literally doing that? Are you using any tools or libraries in these new approaches? How are you going about that change?

Jonathan: Some of it is about education and process, and some of it is about building tooling. For example, with that focus on streaming ingestion into BigQuery, the faster data is available in BigQuery, the more we can leverage tools like Looker to enable people to QA and inspect their events using the same business logic. Before, it was a bit like: if you want to see the raw event in real time, here you go, but if you want basic business logic as well, well, that is a batch, that is over here. We are bringing those two worlds together. We have also had conversations with a variety of vendors in the space over the years. No decisions have been made, but there are interesting companies, Monte Carlo among them, attacking this from different angles: either commoditizing how companies build tracking plans and making that super easy rather than having to build it yourself, or, on the Monte Carlo side, the observability piece that plugs into your existing stack. So, we are looking at all of these things and working out where the build-versus-buy tradeoff is and what is specific to Scopely about the problem.
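As a hedged illustration of the "get tracking right the first time" idea, not Scopely's actual tooling, one approach is to validate events against a declared tracking plan before they ever reach the warehouse, so analytics engineers are not cleaning up after the fact. The schema and event shape below are hypothetical.

```python
from jsonschema import Draft7Validator

# A hypothetical tracking plan: one JSON Schema per event name.
TRACKING_PLAN = {
    "level_complete": {
        "type": "object",
        "properties": {
            "player_id": {"type": "string"},
            "level": {"type": "integer", "minimum": 1},
            "duration_ms": {"type": "integer", "minimum": 0},
        },
        "required": ["player_id", "level", "duration_ms"],
        "additionalProperties": False,
    }
}

def validate_event(name: str, payload: dict) -> list[str]:
    """Return human-readable violations; an empty list means the event is clean."""
    schema = TRACKING_PLAN.get(name)
    if schema is None:
        return [f"unknown event '{name}', not in the tracking plan"]
    return [error.message for error in Draft7Validator(schema).iter_errors(payload)]

# A clean event passes; a mistyped one is caught before it pollutes the warehouse.
print(validate_event("level_complete", {"player_id": "p_123", "level": 4, "duration_ms": 87000}))
print(validate_event("level_complete", {"player_id": "p_123", "level": "four"}))
```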

Boaz: Yeah. Quality and observability are taking the data engineering world by storm, so to say, with vendors like Monte Carlo, Databand, and many others. It is interesting to follow how that plays out and who will stay on top, but most importantly, it is good to see everybody adopting these processes and mindsets, because at the end of the day, it is something that can make a difference. So much time is wasted figuring things out downstream when they are wrong. It is a natural next step for our market, I guess.

Jonathan: Absolutely, and it is great to see. It is interesting working with software engineers and noticing that they were talking about observability of platforms three to five years ago, and then you start to see the same thing crop up in data engineering shortly afterward. I think there is a virtuous cycle of patterns that come over from software engineering and get applied to data engineering and our use cases as well.

Boaz: What about data science? Is there a dedicated data science department? Tell us a little bit about that.

Jonathan: Yeah. We have a data science department. They work principally on the predictive offering of the platform. They are using machine learning, mainly working with Dataproc on GCP, to do things like LTV predictions and churn predictions, but we are moving more and more to a world where we do not just predict what is going to happen but prescribe what would be the best course of action to avoid the negative outcome as well.
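For a concrete picture, here is a minimal, hypothetical sketch of the kind of churn-prediction job that might run on Dataproc (Spark on GCP) as Jonathan describes. The feature columns, training table, and output path are invented; this is not Scopely's model.

```python
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("churn-sketch").getOrCreate()

# Hypothetical training table with per-player aggregates and a churn label.
df = spark.read.table("features.player_activity")

# Assemble a few illustrative behavioral features into a single vector column.
assembler = VectorAssembler(
    inputCols=["sessions_7d", "spend_30d", "days_since_last_login"],
    outputCol="features",
)
lr = LogisticRegression(featuresCol="features", labelCol="churned_30d")

# Fit the pipeline and persist the model for downstream scoring jobs.
model = Pipeline(stages=[assembler, lr]).fit(df)
model.write().overwrite().save("gs://example-models/churn/latest")  # hypothetical bucket
```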

Boaz: How big is that?

Jonathan: Data science is relatively small, less than 10 people, but they work with software engineers and product designers around them to build these products.

Boaz: Well, you mentioned earlier there are 60 to 70 people dealing with data. What sort of processes are in place to stay horizontally aligned? Is there a weekly guild meeting or anything like that? How do you make sure everybody knows what is happening with data?

Jonathan: Yeah. With these embedded teams, we have a matrix management structure. They report both into the games, for example, so that their direction comes mainly from the roadmap of the games, and into the horizontal management, which looks across that and makes sure we move in a consistent, coherent way. There are always challenges when you have got two dimensions competing, but by and large, I think we have done a pretty good job, especially with things like the BigQuery migration, for example, coordinating that across multiple embedded teams and doing it in a consistent way. That definitely stress-tested the organization, and the outcome has been pretty good, I would say.

Boaz: Awesome. Specifically, what does your direct team look like on the platform? 

Jonathan: Typically, I have a few directors who report to me, both the horizontal leads that we were talking about, directors of analytics engineering working in core BI, for example, and the leadership of the data infrastructure team. Plus, I have a group of product managers who report to me, covering a variety of things from core data products to data applications. Going back to that point about the Playgami platform, its most visible product is the Playgami console, which is fundamentally the UI that a game team goes to in order to work with everything it needs. So, how we embed data application capabilities into that also gets product focus.

Boaz: Awesome. So, within that journey at Scopely over the last two years, maybe specifically with the GCP move or in general, what, in retrospect, would you have avoided if you could rewind? What would you do a little bit differently to make it even smoother?

Jonathan: Great question. Well, one thing was that when we were migrating, whether we were moving things like Tableau dashboards and Looker products, game-specific data pipelines, or core data pipelines, it created bottlenecks and pressure on different profiles and parts of the team. If someone is migrating your ETL, they are not building the things you need to leverage in Looker, which affects roadmaps. By and large, I think we got the balance okay, but one of the big drivers for moving to this analytics engineering mindset, having people who can traverse the whole stack more easily, is to avoid that, and we do not intend to migrate data warehouses every couple of years. It brought home that point about bottlenecks and linear dependencies. So, if I could do it again, and I am not sure the circumstances would have allowed it, it would have been great to get that analytics engineering mindset in place first and the stack second, but that is arguably a bit of the tail wagging the dog.

Boaz: Yeah. You mentioned you had been using Google at King as well, and in between, a few years passed, and then you took on Google again at Scopely. How have you felt the platform improve over those years? What is better about the GCP stack today versus when you were still at King?

Jonathan: One of the big things, I guess, is that when I was still at King, Looker was still an independent company, and I was not super surprised when the acquisition happened. But you can really see now how those things are converging and complementing each other. I would say there has also been progress in a bunch of different areas, from simple things like the BigQuery UI and how easy that is to work with, to some of the things under the hood around security and privacy permissions. All of these things have come a long way since I first started playing around with BigQuery, which must have been a good four or five years ago now.

Boaz: Okay. This has been super, super interesting, Jonathan, I cannot thank you enough. Maybe a good note to finish with: what advice would you have for people like yourself who do not come from a technical education and are considering a move into tech?

Jonathan: Great question. One thing that is super cool now, which did not exist so much at the start of my career, is that there are so many bootcamp companies who can help accelerate that and take different views on the problem, organizations like CodeOp, for example, which works with women and underrepresented minorities. There are so many doors opening now, so I think that is one avenue. The other thing I would say is to aim at companies that use the modern data stack, tools like dbt, BigQuery, or Snowflake, for example. They tend to be startups, and those companies tend to move super fast and be super agile. In my experience, someone at the start of their career can learn more in a year working in that sort of environment than they can in five years in a big, monolithic, legacy organization. So I would aim for those roles, and they also tend to be much more open-minded about skills in their hiring as well.

Boaz: Amazing. Great piece of advice. I wish I could go back in time, study history, and take your advice. 

Jonathan: It was by accident rather than by design, but it has worked out okay.

Boaz:  Jonathan, thank you so much. 

Jonathan: Thank you.

Boaz: See you around.

Thank you for listening, everybody.
