GraphQL API Security: Query Complexity and Resolver-Level Protection
How to secure GraphQL APIs against resource-intensive queries, nested attacks, and unauthorized data access through query cost analysis and resolver guards.
Jean-Pierre Broeders
Freelance DevOps Engineer
REST APIs have their own security challenges, but GraphQL introduces an entirely new set of risks. The flexibility GraphQL offers — clients can specify exactly what data they want — also makes it vulnerable to abuse.
A malicious user can craft queries that bring servers to their knees. Think queries with massive nesting depths, or requests fetching thousands of relationships in a single call. GraphQL has no built-in protection against these types of attacks.
The Problem: Query Complexity Attacks
A typical GraphQL schema has relationships between types. For example:
```graphql
type User {
  id: ID!
  name: String!
  posts: [Post!]!
}

type Post {
  id: ID!
  title: String!
  author: User!
  comments: [Comment!]!
}

type Comment {
  id: ID!
  text: String!
  author: User!
}
```
Nothing wrong with this schema. But what if a client sends this query?
```graphql
{
  users {
    posts {
      comments {
        author {
          posts {
            comments {
              author {
                posts {
                  comments {
                    text
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
```
This escalates exponentially. Each nesting layer multiplies the number of database queries. With 100 users, 50 posts per user, 20 comments per post... your CPU melts.
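To put numbers on it: the query above chains users → posts → comments three times. With the figures just mentioned, the object count works out like this:

```javascript
// Rough object count for the nested query above, using the numbers
// from the text: 100 users, 50 posts per user, 20 comments per post.
const USERS = 100, POSTS_PER_USER = 50, COMMENTS_PER_POST = 20;

// Each users → posts → comments round trip multiplies the count by 1000,
// and the query makes that round trip three times.
const fanout = POSTS_PER_USER * COMMENTS_PER_POST; // 1000
const totalComments = USERS * fanout ** 3;

console.log(totalComments); // 100000000000
```

One hundred billion comment objects from a single request — no database survives that.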
Solution 1: Query Depth Limiting
First line of defense: limit how many levels deep a query can go.
```javascript
import depthLimit from 'graphql-depth-limit';

const server = new ApolloServer({
  schema,
  validationRules: [depthLimit(5)]
});
```
Simple but effective. Queries deeper than 5 levels get rejected before they even execute. For most use cases, this is more than enough.
Solution 2: Query Cost Analysis
Depth limiting is crude. Some queries are deep but cheap (a user → profile → settings chain is fine). Others are flat but expensive (fetching all posts from all users).
Query cost analysis assigns each field a "cost" and counts the total:
```javascript
import { createComplexityLimitRule } from 'graphql-validation-complexity';

// First argument is the maximum allowed cost; field costs are configurable
const costLimit = createComplexityLimitRule(5000, {
  scalarCost: 1,
  objectCost: 5,
  listFactor: 10,
  onCost: (cost) => {
    console.log('Query cost:', cost);
  }
});

const server = new ApolloServer({
  schema,
  validationRules: [costLimit]
});
```
List fields get higher costs. Lists within lists escalate quickly. The server refuses queries above the threshold.
This works well but requires fine-tuning. Too strict and you block legitimate use cases. Too lenient and attackers slip through.
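To build intuition for picking maxCost, here is a back-of-the-envelope estimator using a simplified version of this cost model — a leaf scalar costs 1, an object adds 5, a list multiplies its subtree by the list factor. This is a tuning aid, not the library's exact algorithm:

```javascript
// Simplified recursive cost estimator (not the library's exact accounting).
const config = { scalarCost: 1, objectCost: 5, listFactor: 10 };

function estimate(field) {
  if (!field.children) return config.scalarCost; // leaf scalar
  const childCost = field.children.reduce((sum, c) => sum + estimate(c), 0);
  const cost = config.objectCost + childCost;
  return field.isList ? cost * config.listFactor : cost;
}

// { users { posts { title } } }: two nested list fields, one scalar leaf
const usersQuery = {
  isList: true,
  children: [{ isList: true, children: [{}] }]
};

console.log(estimate(usersQuery)); // 650
```

A modest two-list query already costs 650, so a maxCost of 5000 leaves room for legitimate queries while stopping lists nested three or four deep.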
Solution 3: Resolver-Level Authorization
GraphQL resolvers are where the actual data access happens. This is where authentication AND authorization must occur — not just at the endpoint level.
A common mistake:
```javascript
// WRONG: only top-level check
app.use('/graphql', authenticate);

const resolvers = {
  Query: {
    users: () => User.findAll(), // No check!
    posts: () => Post.findAll()  // No check!
  }
};
```
You block unauthenticated requests at the endpoint, but once inside, users can query EVERYTHING. GraphQL introspection exposes the entire schema.
Better:
```javascript
const resolvers = {
  Query: {
    users: async (parent, args, context) => {
      if (!context.user?.isAdmin) {
        throw new ForbiddenError('Admin access required');
      }
      return User.findAll();
    },
    me: async (parent, args, context) => {
      if (!context.user) {
        throw new AuthenticationError('Not authenticated');
      }
      return User.findById(context.user.id);
    }
  },
  User: {
    email: (user, args, context) => {
      // Email only visible to the user themselves or admins
      if (context.user?.id === user.id || context.user?.isAdmin) {
        return user.email;
      }
      return null;
    }
  }
};
```
Each resolver checks permissions. Field-level granularity. No data leaks outside the permitted scope.
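The boilerplate can be tamed with a small higher-order helper — a hypothetical `guard` wrapper, not a library API — that pairs each resolver with an authorization predicate:

```javascript
// Hypothetical helper (not a library API): wraps a resolver with an
// authorization predicate so each resolver stays a one-liner.
const guard = (predicate, resolver) => (parent, args, context, info) => {
  if (!predicate(context)) throw new Error('Forbidden');
  return resolver(parent, args, context, info);
};

const isAdmin = (ctx) => Boolean(ctx.user?.isAdmin);

// Usage with stubbed data in place of a real User model:
const resolvers = {
  Query: {
    users: guard(isAdmin, () => [{ id: '1', name: 'Ada' }])
  }
};
```

Libraries like graphql-shield formalize this pattern as declarative permission middleware, if you'd rather not roll your own.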
Solution 4: Rate Limiting at Resolver Level
Standard endpoint-level rate limiting doesn't work well with GraphQL. Every request hits /graphql, so you're limiting all queries equally — while some are heavy and others light.
Better: limit per resolver or per query complexity.
```javascript
import rateLimit from 'express-rate-limit';
import RedisStore from 'rate-limit-redis';

const complexityLimiter = rateLimit({
  store: new RedisStore({ client: redisClient }),
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: (req) => {
    // Dynamic limit based on query complexity. Compute this value
    // server-side (e.g. attach the cost-analysis result to the request) —
    // never trust a complexity number supplied by the client.
    const complexity = req.queryComplexity || 100;
    return Math.max(1, Math.floor(10000 / complexity));
  },
  keyGenerator: (req) => req.user?.id || req.ip
});

app.use('/graphql', complexityLimiter);
```
Heavy queries count more toward your rate limit budget. Light queries? More allowed per time window.
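The budget math from the max callback above, worked out:

```javascript
// A fixed budget of 10 000 divided by per-query complexity
// gives the per-window request cap, with a floor of 1.
const maxRequests = (complexity) => Math.max(1, Math.floor(10000 / complexity));

console.log(maxRequests(100));  // 100 requests for a light query
console.log(maxRequests(5000)); // 2 requests for a query near the cost cap
```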
Practical Trade-offs
No approach is perfect. Here are the trade-offs:
| Method | Pros | Cons |
|---|---|---|
| Depth Limiting | Simple to implement, low overhead | Blocks valid deep queries |
| Cost Analysis | More accurate resource estimation | Complex configuration, hard to tune |
| Resolver Auth | Granular access control | Lots of boilerplate code |
| Dynamic Rate Limiting | Fair resource distribution | Requires state (Redis), more complex |
For most production setups: combine depth limiting (hard max) with cost analysis (soft max) and field-level authorization. Rate limiting is nice-to-have but not essential if you have complexity well under control.
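Putting that recommended combination into one server config — a sketch assuming the graphql-depth-limit and graphql-validation-complexity packages from the earlier examples:

```javascript
import depthLimit from 'graphql-depth-limit';
import { createComplexityLimitRule } from 'graphql-validation-complexity';

const server = new ApolloServer({
  schema,
  validationRules: [
    depthLimit(10),                  // hard max on nesting depth
    createComplexityLimitRule(5000)  // soft max on total query cost
  ],
  // Introspection exposes the entire schema, so disable it in production
  introspection: process.env.NODE_ENV !== 'production'
});
```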
Monitoring and Alerting
Security without visibility is flying blind. Log query complexity metrics:
```javascript
const server = new ApolloServer({
  schema,
  plugins: [
    {
      requestDidStart() {
        return {
          executionDidStart() {
            return {
              willResolveField({ info }) {
                const start = Date.now();
                return () => {
                  const elapsed = Date.now() - start;
                  if (elapsed > 1000) {
                    console.warn(`Slow resolver: ${info.parentType}.${info.fieldName} (${elapsed}ms)`);
                  }
                };
              }
            };
          }
        };
      }
    }
  ]
});
```
Track which resolvers are slow. Alert on abnormal query patterns. Identify bottlenecks before they cause problems.
Conclusion
GraphQL offers powerful flexibility, but that comes with responsibility. Depth limits prevent the most obvious attacks. Cost analysis gives fine-grained control. Resolver-level authorization ensures data doesn't leak. Rate limiting keeps abuse in check.
Most security issues arise not from inadequate tools, but from lack of awareness. GraphQL is secure when you understand where the risks lie — and proactively protect against resource exhaustion and unauthorized access.
Implement these defensive layers, monitor actively, and adjust based on real usage patterns. That scales better than firefighting after the fact.
