12 essential data structures for Rust developers. Each one explained with Big O complexity and real code samples you can copy.
## [T; N]
Fixed-size, contiguous block of memory. Elements are stored sequentially and accessed by index in constant time. The foundation of most other data structures.
// Fixed-size array [T; N] - stack-allocated
let mut nums: [i32; 5] = [10, 20, 30, 40, 50];
nums[0] = 99; // O(1) - direct index
let third = nums[2]; // O(1) - read by index
// Iterate and transform
let doubled: Vec<i32> = nums.iter().map(|x| x * 2).collect();
// Search and sort
nums.sort(); // O(n log n) - in-place
let idx = nums.binary_search(&30); // O(log n) - Result<usize, usize>
let has = nums.contains(&40); // O(n) - linear scan
let sum: i32 = nums.iter().sum(); // O(n)

## Vec&lt;T&gt;
Resizable array that automatically grows when capacity is exceeded. One of the most commonly used data structures in any language. Doubles its internal storage when full, giving amortized O(1) appends.
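The amortized-O(1) claim comes from the doubling growth strategy. A quick sketch that observes it through `capacity()` — note the exact growth sequence is an implementation detail and may differ between Rust versions:

```rust
fn main() {
    let mut v: Vec<i32> = Vec::new();
    let mut caps = vec![v.capacity()]; // starts at 0 - no allocation yet
    for i in 0..17 {
        v.push(i);
        if v.capacity() != *caps.last().unwrap() {
            caps.push(v.capacity()); // record each reallocation
        }
    }
    // Capacity jumps in large steps (roughly doubling) rather than by 1
    // per push, so total copy work over n pushes is O(n) - O(1) amortized.
    println!("growth: {caps:?}");
    assert!(v.capacity() >= 17);
}
```

When the final size is known up front, `Vec::with_capacity(n)` skips the reallocations entirely.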
let mut names: Vec<String> = vec!["Alice".into(), "Bob".into()];
names.push("Charlie".into()); // O(1) amortized
names.insert(1, "Diana".into()); // O(n) - shifts elements right
names.remove(0); // O(n) - shifts elements left
names.pop(); // O(1) - remove last
let has = names.contains(&"Bob".into()); // O(n) - linear scan
let first = &names[0]; // O(1) - direct index
// Sort and dedup
names.sort(); // O(n log n)
names.dedup(); // O(n) - remove consecutive dups
names.retain(|n| n.len() > 3); // O(n) - filter in place

## Vec (push/pop)
Last-In-First-Out (LIFO) collection. Only the top element is accessible. Used for tracking state that must be unwound in reverse order.
// Vec as stack - push/pop from the end
let mut stack: Vec<i32> = Vec::new();
stack.push(10); // O(1)
stack.push(20);
stack.push(30);
let top = stack.last(); // O(1) - peek -> Some(30)
let val = stack.pop(); // O(1) - pop -> Some(30)
// Interview pattern: valid parentheses
fn is_valid(s: &str) -> bool {
    let mut st: Vec<char> = Vec::new();
    for c in s.chars() {
        match c {
            '(' => st.push(')'),
            '[' => st.push(']'),
            '{' => st.push('}'),
            _ => if st.pop() != Some(c) { return false; },
        }
    }
    st.is_empty()
}

## VecDeque&lt;T&gt;
First-In-First-Out (FIFO) collection. Elements are added at the back and removed from the front. Fundamental for breadth-first processing.
use std::collections::VecDeque;
let mut queue: VecDeque<&str> = VecDeque::new();
queue.push_back("Task A"); // O(1) - enqueue
queue.push_back("Task B");
queue.push_back("Task C");
let front = queue.front(); // O(1) - peek -> Some("Task A")
let val = queue.pop_front(); // O(1) - dequeue -> Some("Task A")
// Interview pattern: BFS level-order
// Assumes the conventional interview tree node:
// struct TreeNode { val: i32, left: Option<Rc<RefCell<TreeNode>>>, right: Option<Rc<RefCell<TreeNode>>> }
use std::cell::RefCell;
use std::rc::Rc;
fn bfs(root: Option<Rc<RefCell<TreeNode>>>) -> Vec<Vec<i32>> {
    let mut q = VecDeque::new();
    let mut result = vec![];
    if let Some(r) = root { q.push_back(r); }
    while !q.is_empty() {
        let mut level = Vec::with_capacity(q.len());
        for _ in 0..q.len() { // drain exactly one level
            let node = q.pop_front().unwrap();
            let node = node.borrow();
            level.push(node.val);
            if let Some(l) = &node.left { q.push_back(Rc::clone(l)); }
            if let Some(r) = &node.right { q.push_back(Rc::clone(r)); }
        }
        result.push(level);
    }
    result
}

## HashMap&lt;K,V&gt;
Maps keys to values using a hash function for near-constant-time lookups. The single most important data structure for coding interviews. Every language has a built-in implementation.
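One HashMap pattern worth knowing cold is the entry API, which turns frequency counting into a one-liner. A small sketch (the word list is just illustrative):

```rust
use std::collections::HashMap;

fn main() {
    // Frequency counting with the entry API
    let words = ["apple", "banana", "apple", "cherry", "banana", "apple"];
    let mut freq: HashMap<&str, i32> = HashMap::new();
    for &w in &words {
        *freq.entry(w).or_insert(0) += 1; // insert-or-update in one lookup
    }
    assert_eq!(freq["apple"], 3);
    assert_eq!(freq["cherry"], 1);
    println!("{freq:?}");
}
```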
use std::collections::HashMap;
let mut map: HashMap<&str, i32> = HashMap::new();
map.insert("apple", 3); // O(1)
map.insert("banana", 5); // O(1)
map.insert("apple", 10); // O(1) - update
let has = map.contains_key("banana"); // O(1)
map.remove("banana"); // O(1)
// Safe lookup with default
let count = map.get("cherry").copied().unwrap_or(0); // O(1)
// Interview pattern: Two Sum
fn two_sum(nums: &[i32], target: i32) -> Vec<usize> {
    let mut seen: HashMap<i32, usize> = HashMap::new();
    for (i, &num) in nums.iter().enumerate() {
        if let Some(&j) = seen.get(&(target - num)) {
            return vec![j, i];
        }
        seen.insert(num, i);
    }
    vec![]
}

## HashSet&lt;T&gt;
Unordered collection of unique elements. Uses hashing internally for O(1) membership testing. Supports mathematical set operations like union, intersection, and difference.
use std::collections::HashSet;
let mut set: HashSet<i32> = [1, 2, 3, 4, 5].into_iter().collect();
set.insert(6); // O(1) - true (added)
set.insert(3); // O(1) - false (duplicate)
set.remove(&1); // O(1)
let has = set.contains(&4); // O(1) - true
// Set operations
let other: HashSet<i32> = [4, 5, 6, 7].into_iter().collect();
let inter: HashSet<_> = set.intersection(&other).collect(); // {4, 5, 6}
let union: HashSet<_> = set.union(&other).collect();
let diff: HashSet<_> = set.difference(&other).collect();
// Interview pattern: contains duplicate
fn contains_duplicate(nums: &[i32]) -> bool {
    let mut seen = HashSet::new();
    nums.iter().any(|n| !seen.insert(n)) // insert returns false if dup
}

## LinkedList&lt;T&gt;
Sequence of nodes where each node points to the next (singly linked) or both next and previous (doubly linked). Efficient insertion and deletion at any known position, but no index-based access.
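Because std's LinkedList exposes no node handles, interview problems are usually solved with a hand-rolled singly linked list instead. A minimal sketch of that pattern, reversing a list (`ListNode` here is the conventional interview definition, not a std type):

```rust
// Hand-rolled singly linked list - the shape most interview problems use
struct ListNode {
    val: i32,
    next: Option<Box<ListNode>>,
}

fn reverse(mut head: Option<Box<ListNode>>) -> Option<Box<ListNode>> {
    let mut prev: Option<Box<ListNode>> = None;
    while let Some(mut node) = head {
        head = node.next.take(); // detach the rest of the list
        node.next = prev;        // point this node backwards
        prev = Some(node);
    }
    prev
}

fn main() {
    // Build 1 -> 2 -> 3
    let list = Some(Box::new(ListNode { val: 1, next:
               Some(Box::new(ListNode { val: 2, next:
               Some(Box::new(ListNode { val: 3, next: None })) })) }));
    let rev = reverse(list);
    assert_eq!(rev.as_ref().unwrap().val, 3); // head is now 3
}
```

Ownership via `Option<Box<...>>` plus `take()` is what makes this compile without unsafe code.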
use std::collections::LinkedList;
let mut list = LinkedList::new();
list.push_back(10); // O(1)
list.push_back(20);
list.push_front(5); // O(1)
let front = list.front(); // O(1) - peek -> Some(5)
let back = list.back(); // O(1) - peek -> Some(20)
list.pop_front(); // O(1) - remove front
list.pop_back(); // O(1) - remove back
let has = list.contains(&10); // O(n) - linear scan
let len = list.len(); // O(1)
// Note: std LinkedList has no index access - O(n) traversal only
// For interview problems, prefer defining a custom ListNode:
// struct ListNode { val: i32, next: Option&lt;Box&lt;ListNode&gt;&gt; }

## BTreeSet&lt;T&gt;
Collection of unique elements maintained in sorted order. In Rust it is backed by a B-tree (other languages often use a red-black tree). Supports range queries and O(log n) min/max.
use std::collections::BTreeSet;
let mut sorted = BTreeSet::new();
for &v in &[5, 3, 8, 1, 9] { sorted.insert(v); }
// Maintains sorted order: {1, 3, 5, 8, 9} (B-tree)
sorted.insert(4); // O(log n)
sorted.remove(&3); // O(log n)
let has = sorted.contains(&8); // O(log n) - true
let min = sorted.first(); // O(log n) - Some(&1)
let max = sorted.last(); // O(log n) - Some(&9)
// Range queries - powerful for interval problems
let range: Vec<_> = sorted.range(4..9).collect(); // [4, 5, 8]
let range_incl: Vec&lt;_&gt; = sorted.range(4..=9).collect(); // [4, 5, 8, 9]

## BTreeMap&lt;K,V&gt;
Key-value pairs maintained in sorted key order, backed in Rust by a B-tree. Enables ordered iteration and range lookups that hash maps cannot provide.
use std::collections::BTreeMap;
let mut tm = BTreeMap::new();
tm.insert("banana", 2); // O(log n)
tm.insert("apple", 5);
tm.insert("cherry", 1);
tm.insert("date", 3);
let val = tm.get("apple"); // O(log n) -> Some(5)
tm.remove("cherry"); // O(log n)
// Iterates in sorted key order
for (key, val) in &tm {
println!("{key}: {val}"); // apple: 5, banana: 2, date: 3
}
// Range and navigation
let first = tm.iter().next(); // ("apple", 5)
let last = tm.iter().next_back(); // ("date", 3)
let sub: Vec&lt;_&gt; = tm.range("apple"..="cherry").collect(); // [("apple", 5), ("banana", 2)]

## BinaryHeap&lt;T&gt;
Collection where elements are dequeued by priority rather than insertion order. In Rust, BinaryHeap is a max-heap implemented as a binary heap. Essential for shortest-path algorithms and top-K problems.
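The top-K use case mentioned above can be sketched with a bounded min-heap: keep a heap of `Reverse`-wrapped values of size k, evicting the smallest whenever it grows past k. `top_k` is a name invented here for illustration:

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

// Keep the k largest values using a min-heap of size k - O(n log k)
fn top_k(nums: &[i32], k: usize) -> Vec<i32> {
    let mut heap: BinaryHeap<Reverse<i32>> = BinaryHeap::new();
    for &n in nums {
        heap.push(Reverse(n));          // O(log k)
        if heap.len() > k {
            heap.pop();                 // evict the smallest of the k+1
        }
    }
    let mut out: Vec<i32> = heap.into_iter().map(|Reverse(n)| n).collect();
    out.sort_unstable_by(|a, b| b.cmp(a)); // largest first
    out
}

fn main() {
    assert_eq!(top_k(&[3, 1, 5, 12, 2, 11], 3), vec![12, 11, 5]);
}
```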
use std::collections::BinaryHeap;
// Max-heap by default
let mut heap = BinaryHeap::new();
heap.push(30); // O(log n)
heap.push(10);
heap.push(20);
let max = heap.peek(); // O(1) - Some(30)
let val = heap.pop(); // O(log n) - Some(30)
// Min-heap: wrap in Reverse
use std::cmp::Reverse;
let mut min_heap = BinaryHeap::new();
min_heap.push(Reverse(30));
min_heap.push(Reverse(10));
let min = min_heap.pop(); // Some(Reverse(10))
// Build heap from vec - O(n)
let heap = BinaryHeap::from(vec![5, 3, 8, 1, 9]);

## Mutex&lt;HashMap&gt; / DashMap
Thread-safe hash map for concurrent read/write access from multiple threads. A Mutex&lt;HashMap&gt; serializes all access behind a single lock; the dashmap crate uses fine-grained, shard-level locking so threads touching different shards don't block each other.
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
// Option 1: Mutex + HashMap (standard library)
let cache = Arc::new(Mutex::new(HashMap::<String, i32>::new()));
{
    let mut map = cache.lock().unwrap(); // acquire lock (blocks other threads)
    map.insert("hits".into(), 0); // O(1)
    *map.entry("hits".into()).or_insert(0) += 1; // safe increment under the lock
} // lock released here
// Option 2: dashmap crate (concurrent HashMap, no global lock)
// use dashmap::DashMap;
// let cache = DashMap::new();
// cache.insert("hits", 0); // O(1) - shard-level lock
// cache.entry("hits").and_modify(|v| *v += 1); // atomic update
// let val = cache.get("hits"); // O(1) - read lock on shard

## &amp;[T] slice
Zero-copy view over a contiguous region of memory. Lets you reference a portion of an array or buffer without allocating new memory. Critical for performance-sensitive parsing and processing.
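Alongside subranges, slices provide overlapping windows, the basis of sliding-window problems. A small sketch on illustrative data:

```rust
fn main() {
    let data = [1, 2, 3, 4, 5];
    // Overlapping windows of length 2: [[1,2], [2,3], [3,4], [4,5]]
    let max_pair_sum = data
        .windows(2)          // O(1) per window - each is a zero-copy view
        .map(|w| w[0] + w[1])
        .max()
        .unwrap();
    assert_eq!(max_pair_sum, 9); // 4 + 5
}
```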
// Slices &[T] - zero-copy view into contiguous memory
let data = vec![1, 2, 3, 4, 5];
let slice: &[i32] = &data[1..4]; // O(1) - view of [2, 3, 4]
// Mutable slice
let mut buf = vec![10, 20, 30, 40];
let part: &mut [i32] = &mut buf[..2]; // O(1) - mutable view
part[0] = 99; // mutates original
// Split without allocation
let (left, right) = data.split_at(3); // O(1) - [1,2,3] and [4,5]
// Byte slices for binary parsing
let bytes: &[u8] = b"Hello, World!";
let header = &bytes[..5]; // O(1) - b"Hello"
let body = &bytes[7..]; // O(1) - b"World!"
// Chunks for batch processing
for chunk in data.chunks(2) { // [[1,2], [3,4], [5]]
    println!("{chunk:?}");
}

Average-case time complexity. * = amortized.
| Structure | Access | Search | Insert | Delete |
|---|---|---|---|---|
| Array | O(1) | O(n) | O(n) | O(n) |
| Dynamic Array | O(1) | O(n) | O(1)* | O(n) |
| Stack | O(n) | O(n) | O(1)* | O(1) |
| Queue | O(n) | O(n) | O(1)* | O(1) |
| Hash Map | O(1) | O(1) | O(1)* | O(1) |
| Hash Set | N/A | O(1) | O(1)* | O(1) |
| Linked List | O(n) | O(n) | O(1) | O(1) |
| Sorted Set | O(n) | O(log n) | O(log n) | O(log n) |
| Sorted Map | O(log n) | O(log n) | O(log n) | O(log n) |
| Priority Queue | O(n) | O(n) | O(log n) | O(log n) |
| Concurrent Map | O(1) | O(1) | O(1)* | O(1) |
| Memory View | O(1) | O(n) | N/A | N/A |
| I need to... | Use |
|---|---|
| Store items by index, resize dynamically | List / Dynamic Array |
| Map keys to values with O(1) lookup | HashMap / Dictionary |
| Track unique items, check existence in O(1) | HashSet / Set |
| Last-in-first-out (undo, DFS, brackets) | Stack |
| First-in-first-out (BFS, task queues) | Queue |
| Keep elements sorted at all times | SortedSet / TreeSet |
| Process items by priority (Dijkstra, top-K) | PriorityQueue / Heap |
| Insert/delete at a known position in O(1) | LinkedList |
| Sorted key-value pairs | SortedDictionary / TreeMap |
| Thread-safe shared cache | ConcurrentDictionary |
| Slice arrays/strings without copying | Span / Slice / memoryview |
The most commonly used are dynamic arrays (List/ArrayList/vector), hash maps (Dictionary/HashMap/dict), and hash sets. For interviews, also know stacks, queues, trees, and priority queues. These cover 90%+ of coding interview problems.
Start with the dynamic array and hash map. Together they solve the majority of interview problems. Then learn stacks (for DFS, bracket matching) and queues (for BFS). After that, tackle trees, heaps, and graphs.
No. Big O measures algorithmic complexity, not language-specific performance. A hash map lookup is O(1) whether you use Python dict, Java HashMap, or C# Dictionary. Constant factors differ (C++ is faster than Python in wall-clock time), but Big O is the same.
Yes, in Rust: the standard library provides std::collections::BinaryHeap, a max-heap. Wrap elements in std::cmp::Reverse (or use a custom Ord) to get a min-heap.
Practice with an AI interviewer that asks you to implement, optimize, and explain data structure choices in real time.