curriculum/challenges/english/blocks/quiz-data-structures/67f41341453c2247fb2828f7.md
To pass the quiz, you must correctly answer at least 18 of the 20 questions below.
What does Big O notation describe in algorithm analysis?
The exact runtime in seconds for a specific computer.
The percentage of code lines executed during a run.
How readable the code is to other developers.
How the time or space grows relative to input size (an upper bound).
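A minimal sketch of the correct option: this illustrative `find_max` function does one comparison per element, so its running time grows linearly with the input size, i.e. O(n).

```python
def find_max(items):
    """Linear scan: comparisons grow with len(items), so this is O(n)."""
    best = items[0]
    for x in items[1:]:   # one comparison per remaining element
        if x > best:
            best = x
    return best

result = find_max([3, 1, 4, 1, 5, 9, 2, 6])
```

Doubling the input roughly doubles the work; Big O captures that growth rate, not the wall-clock time on any particular machine.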
When starting an algorithmic challenge, what is the best first step?
Begin coding immediately to gain momentum.
Optimize for performance before you understand the problem.
Write unit tests only after finishing the solution.
Clarify the problem and constraints with examples and edge cases.
What is the key difference between dynamic arrays and static arrays?
Dynamic arrays store values of different types; static arrays cannot.
Static arrays allow duplicate values; dynamic arrays do not.
Dynamic arrays are faster than static arrays for every operation.
Dynamic arrays can grow or shrink by resizing; static arrays have a fixed size.
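As a concrete illustration of the correct option, a Python `list` is a dynamic array: it resizes itself behind the scenes as elements are appended or removed, so no fixed capacity is declared up front.

```python
nums = []              # a dynamic array (Python list) starts empty
for i in range(5):
    nums.append(i)     # the underlying storage grows automatically as needed
nums.pop()             # it can shrink too
```

A static array, by contrast, is allocated with a fixed length that cannot change after creation.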
What is the amortized time complexity of appending an element to the end of a dynamic array?
O(n)
O(log n)
O(n log n)
O(1) amortized
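A toy sketch of why appends are O(1) amortized: this hypothetical `DynamicArray` doubles its capacity when full, so the occasional O(n) copy is spread across many cheap O(1) writes.

```python
class DynamicArray:
    """Toy dynamic array: doubles capacity when full."""

    def __init__(self):
        self._cap = 1
        self._size = 0
        self._data = [None] * self._cap

    def append(self, value):
        if self._size == self._cap:        # full: rare O(n) copy into a buffer twice as large
            self._cap *= 2
            new_data = [None] * self._cap
            for i in range(self._size):
                new_data[i] = self._data[i]
            self._data = new_data
        self._data[self._size] = value     # common case: a single O(1) write
        self._size += 1

    def __len__(self):
        return self._size

arr = DynamicArray()
for i in range(10):
    arr.append(i)
```

After 10 appends the capacity has doubled through 1, 2, 4, 8, 16, but the total copying cost stays proportional to the number of appends, hence O(1) per append on average.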
Why does accessing the k-th element by index in a singly linked list take O(n) time?
The list must be resized before any access.
The index is hashed and looked up in a table.
Nodes are stored contiguously, so shifting is required.
You must traverse from the head node to the k-th node one by one.
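The correct option can be sketched directly: with only `next` pointers, reaching index k means following k links from the head, which is O(n) in the worst case. `Node` and `get_kth` here are illustrative names, not a standard library API.

```python
class Node:
    """One link of a singly linked list."""

    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def get_kth(head, k):
    """Walk k links from the head -- no shortcut exists, so this is O(n)."""
    node = head
    for _ in range(k):
        node = node.next
    return node.value

head = Node(10, Node(20, Node(30)))
third = get_kth(head, 2)
```

Contrast this with an array, where index arithmetic jumps straight to the element in O(1).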
Which feature does a doubly linked list have that a singly linked list does not?
Random access to any index in O(1) time.
A built-in array buffer for faster iteration.
Automatic tracking of the list length in constant time.
Pointers to both next and previous nodes enabling backward traversal.
Which of the following best describes a stack?
First In, First Out (FIFO) with removals at the front.
A structure where any element can be removed in O(1) time.
A circular buffer with constant-time random access.
Last In, First Out (LIFO) with push and pop at the top.
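A minimal illustration of LIFO behavior, using a Python list as the stack (`append` acts as push, `pop` removes from the top):

```python
stack = []
stack.append("a")   # push "a"
stack.append("b")   # push "b" on top of "a"
top = stack.pop()   # pop returns the most recently pushed element
```

The last element pushed (`"b"`) is the first one popped.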
Which operation removes the element at the front of a queue?
push
pop
peek
dequeue
What is the typical average-case time complexity to look up a value by key in a hash map?
O(n) because all keys must be scanned sequentially.
O(log n) due to binary search within buckets.
O(n log n) because keys are sorted during insertion.
O(1) on average with a good hash function and low load factor.
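The correct option in action: a Python `dict` is a hash map, so looking up a value by key hashes the key and jumps to its bucket in O(1) time on average.

```python
ages = {"ada": 36, "alan": 41}
alan_age = ages["alan"]       # average O(1): hash the key, go straight to its slot
has_grace = "grace" in ages   # membership checks are O(1) on average too
```

No sequential scan of all keys is needed, which is what makes hash maps faster than a plain list for lookups.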
Which guarantee is provided by a set data structure?
Elements are stored in sorted order by default.
Duplicate values are allowed and kept together.
Elements are indexed by their insertion position.
It stores only unique elements (no duplicates).
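The uniqueness guarantee is easy to demonstrate with a Python `set`, which silently discards duplicates on construction and insertion:

```python
unique = set([1, 2, 2, 3, 3, 3])   # duplicates are dropped
unique.add(3)                      # adding an existing element changes nothing
```

Note that a set guarantees uniqueness, not any particular ordering of its elements.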
In a dynamic array, what is the worst-case time complexity of inserting an element at index i (not at the end)?
O(1)
O(log n)
O(1) amortized
O(n)
What is the time complexity of inserting a new node at the head of a singly linked list?
O(n)
O(log n)
O(n log n)
O(1)
Which operation is used to remove an element from a stack?
push
dequeue
insert at the bottom
pop
Which of the following best describes a queue?
Last In, First Out (LIFO) with removals at the top.
Random access to any index in O(1) time.
Elements are always kept in sorted order automatically.
First In, First Out (FIFO) with enqueue at the back and dequeue at the front.
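A small FIFO sketch using `collections.deque`, where `append` enqueues at the back and `popleft` dequeues from the front:

```python
from collections import deque

queue = deque()
queue.append("first")    # enqueue at the back
queue.append("second")
front = queue.popleft()  # dequeue from the front
```

The element that entered first (`"first"`) is the first to leave, unlike a stack.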
What is a hash collision in a hash map?
When a key maps to multiple distinct values by design.
When two identical keys are stored in different buckets.
When the map runs out of memory and must be resized.
When two different keys produce the same hash index.
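A collision can be shown with a toy bucket function (the name `bucket_index` and the bucket count are illustrative). With 8 buckets, the distinct keys 1 and 9 hash to the same slot, because CPython's `hash` of a small integer is the integer itself.

```python
def bucket_index(key, num_buckets=8):
    """Map a key to one of num_buckets slots."""
    return hash(key) % num_buckets

# Two different keys landing in the same bucket is a collision.
slot_a = bucket_index(1)   # 1 % 8 == 1
slot_b = bucket_index(9)   # 9 % 8 == 1
```

Real hash maps resolve collisions with techniques such as chaining (a list per bucket) or open addressing.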
Why do hash maps resize (rehash) as they grow?
To sort keys in ascending order for faster iteration.
To compress values and reduce memory fragmentation.
To avoid triggering the language's garbage collector.
To keep the load factor low so that average operations remain O(1).
Which statement about sets is true?
Sets preserve insertion order by definition.
Sets allow duplicate elements and keep counts.
Set membership tests are O(n log n) on average.
Membership tests are typically O(1) on average.
Which time complexity grows faster than O(n log n) as n becomes large?
O(n)
O(log n)
O(1)
O(n^2)
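The correct option can be checked numerically: for large n, n² dwarfs n log n, and the gap keeps widening as n grows.

```python
import math

def n_log_n(n):
    return n * math.log2(n)

def n_squared(n):
    return n * n

n = 1_000_000
# The ratio n^2 / (n log n) simplifies to n / log2(n), which grows without bound.
ratio = n_squared(n) / n_log_n(n)
```

At n = 1,000,000 the quadratic term is already tens of thousands of times larger, which is why O(n²) algorithms become impractical long before O(n log n) ones do.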
After implementing a brute-force solution, what is a good next step?
Micro-optimize constant factors before measuring.
Discard tests and rewrite the solution from scratch.
Avoid considering edge cases to keep the code simple.
Analyze its time/space complexity and optimize identified bottlenecks.
What does space complexity measure?
How many CPU cores a program uses.
The length of a program in lines of code.
How long a program takes to compile.
How memory usage grows relative to input size.