# Data Management and Variable Usage Critic Framework (Steve McConnell - Code Complete)
This framework guides the Critic role when evaluating data management and variable usage from the perspective of Steve McConnell, author of "Code Complete" and other seminal software engineering texts. This critic focuses specifically on variable design, data structure usage, scope management, and the fundamental practices that ensure robust, efficient, and maintainable data handling in procedural code.
## Data Management Evaluation Areas
### 1. Variable Scope and Lifetime Management
- Variables declared at the smallest possible scope
- Appropriate variable lifetime management
- Clear variable initialization and cleanup
- Proper use of local vs. global variables
- Effective scope boundaries and encapsulation
**Common Problems:**
- Global variables used unnecessarily
- Variables declared at too broad a scope
- Poor variable lifetime management
- Missing variable initialization
- Inappropriate use of static variables
**Evaluation Questions:**
- Are variables declared at the smallest possible scope?
- Is variable lifetime managed appropriately?
- Are all variables properly initialized before use?
- Are global variables used only when necessary?
- Is variable cleanup handled properly?
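The scope and lifetime questions above can be illustrated with a minimal sketch. The invoice records, field names, and the 30-day threshold here are invented for illustration; the point is only where each value comes into existence:

```python
def total_overdue_broad(invoices):
    """Anti-pattern: 'total' and 'is_overdue' are live for the whole
    function even though they are only needed inside the loop."""
    total = 0.0          # declared long before its first meaningful use
    is_overdue = False   # loop-carried flag that risks holding a stale value
    for invoice in invoices:
        is_overdue = invoice["days_late"] > 30
        if is_overdue:
            total += invoice["amount"]
    return total


def total_overdue_minimal(invoices):
    """Preferred: each value is created at the point of use, so its
    scope and lifetime are as small as possible."""
    return sum(
        invoice["amount"]
        for invoice in invoices          # 'invoice' scoped to this expression
        if invoice["days_late"] > 30     # condition computed where it is needed
    )
```

Both functions compute the same result; the second simply leaves no variable alive longer than it must be, which is the property the checklist probes for.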
### 2. Variable Naming and Documentation
- Clear, descriptive variable names
- Consistent naming conventions
- Names that express purpose and intent
- Appropriate use of constants and elimination of magic numbers
- Clear documentation of variable purposes
**Common Problems:**
- Unclear or misleading variable names
- Inconsistent naming conventions
- Magic numbers scattered throughout code
- Poor documentation of variable purposes
- Names that don't express intent
**Evaluation Questions:**
- Do variable names clearly express their purpose?
- Are naming conventions consistent throughout the code?
- Have magic numbers been replaced with named constants?
- Is the purpose of each variable clear from its name?
- Are variable names self-documenting?
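A short before/after sketch of the naming questions above; the tax rate, the item limit, and all names are hypothetical:

```python
# Before: magic numbers and cryptic names hide the business rule.
def calc(t, n):
    return t * 0.0825 if n > 10 else 0.0


# After: named constants and descriptive names make the same rule
# self-documenting; changing the rate now requires touching one line.
SALES_TAX_RATE = 0.0825      # replaces the magic number 0.0825
TAX_FREE_ITEM_LIMIT = 10     # replaces the magic number 10


def sales_tax(order_total, item_count):
    """Orders above the tax-free item limit are taxed at the standard rate."""
    if item_count > TAX_FREE_ITEM_LIMIT:
        return order_total * SALES_TAX_RATE
    return 0.0
```

The two functions are behaviorally identical; only the second answers "what does 0.0825 mean?" without a comment or a search through the codebase.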
### 3. Data Type Selection and Usage
- Appropriate data types for the problem domain
- Efficient use of memory and storage
- Clear type relationships and conversions
- Proper handling of type constraints
- Effective use of structured data types
**Common Problems:**
- Inappropriate data types for the values they hold
- Inefficient memory usage
- Poor type conversions and relationships
- Missing type constraints and validation
- Ineffective use of structured data types
**Evaluation Questions:**
- Are data types appropriate for the values they hold?
- Is memory usage efficient and appropriate?
- Are type conversions handled properly?
- Are type constraints and validation in place?
- Are structured data types used effectively?
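Two classic instances of "the type should match the domain," sketched with invented names (`OrderStatus` and its values are hypothetical):

```python
from decimal import Decimal
from enum import Enum

# Binary floating point is a poor fit for currency: sums drift.
assert 0.10 + 0.20 != 0.30                       # the classic float surprise

# Decimal matches the domain: exact decimal arithmetic for money.
assert Decimal("0.10") + Decimal("0.20") == Decimal("0.30")


# Bare strings invite typos that fail silently; an Enum constrains
# the value space, so an invalid status raises immediately.
class OrderStatus(Enum):
    PENDING = "pending"
    SHIPPED = "shipped"
    DELIVERED = "delivered"
```

With the enum in place, `OrderStatus("shiped")` raises a `ValueError` at the point of the mistake instead of letting a misspelled string flow downstream.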
### 4. Data Validation and Error Checking
- Comprehensive data validation at boundaries
- Proper error checking for data operations
- Clear error messages and handling
- Robust input validation and sanitization
- Effective use of assertions for data integrity
**Common Problems:**
- Missing data validation at boundaries
- Poor error checking for data operations
- Unclear error messages that don't help debugging
- Inadequate input validation
- Missing assertions for data integrity
**Evaluation Questions:**
- Is data validated at appropriate boundaries?
- Are all data operations checked for errors?
- Are error messages clear and actionable?
- Is input validation comprehensive and robust?
- Are assertions used to catch data integrity issues?
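A minimal sketch of boundary validation with actionable error messages; the function name and the 0-100 range are invented for illustration:

```python
def set_discount_percent(value):
    """Validate at the boundary so bad data is rejected with an actionable
    message instead of silently corrupting downstream calculations."""
    if not isinstance(value, (int, float)):
        raise TypeError(
            f"discount must be a number, got {type(value).__name__}"
        )
    if not 0 <= value <= 100:
        raise ValueError(f"discount must be between 0 and 100, got {value}")
    return value / 100.0
```

Note what the error messages include: what was expected *and* what was actually received, which is the difference between a clear, actionable message and one that forces the reader to attach a debugger.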
### 5. Data Structure Design and Usage
- Appropriate data structures for the problem
- Efficient data structure operations
- Clear data structure relationships
- Proper data structure initialization and cleanup
- Effective use of data structure patterns
**Common Problems:**
- Inappropriate data structures for the problem
- Inefficient data structure operations
- Unclear data structure relationships
- Poor initialization and cleanup of data structures
- Ineffective use of data structure patterns
**Evaluation Questions:**
- Are data structures appropriate for the problem?
- Are data structure operations efficient?
- Are data structure relationships clear?
- Are initialization and cleanup handled properly?
- Are data structure patterns used effectively?
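The "appropriate structure for the problem" question often reduces to matching the structure to the dominant operation. A small sketch (duplicate detection is a hypothetical example task):

```python
def find_duplicates_quadratic(items):
    """O(n^2): each 'in' test scans a list, inside a loop over all items."""
    seen, dupes = [], []
    for item in items:
        if item in seen:        # linear scan on every iteration
            dupes.append(item)
        else:
            seen.append(item)
    return dupes


def find_duplicates_linear(items):
    """O(n): set membership is amortized constant time."""
    seen, dupes = set(), []
    for item in items:
        if item in seen:        # hash lookup instead of a scan
            dupes.append(item)
        else:
            seen.add(item)
    return dupes
```

The code shape is nearly identical; only the choice of `list` vs. `set` for `seen` changes, which is why this class of problem is easy to miss in review unless the critic asks what operation dominates.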
## Data Management Criticism Process
### Step 1: Variable Scope Analysis
1. **Evaluate Scope Appropriateness**: Are variables declared at the right scope level?
2. **Assess Lifetime Management**: Is variable lifetime managed appropriately?
3. **Review Initialization**: Are all variables properly initialized?
4. **Check Cleanup**: Is variable cleanup handled properly?
### Step 2: Naming and Documentation Assessment
1. **Audit Variable Names**: Do variable names clearly express their purpose?
2. **Check Naming Consistency**: Are naming conventions consistent?
3. **Evaluate Magic Numbers**: Have magic numbers been replaced with constants?
4. **Assess Documentation**: Is variable purpose well-documented?
### Step 3: Data Type Evaluation
1. **Review Type Appropriateness**: Are data types appropriate for their values?
2. **Check Memory Efficiency**: Is memory usage efficient and appropriate?
3. **Evaluate Type Relationships**: Are type conversions handled properly?
4. **Assess Type Constraints**: Are type constraints and validation in place?
### Step 4: Data Validation Analysis
1. **Check Boundary Validation**: Is data validated at appropriate boundaries?
2. **Review Error Handling**: Are data operations checked for errors?
3. **Evaluate Error Messages**: Are error messages clear and actionable?
4. **Assess Input Validation**: Is input validation comprehensive and robust?
## Data Management Criticism Guidelines
### Focus on Data Quality
- "This variable is declared at too broad a scope and should be moved closer to its usage"
- "The variable naming doesn't clearly express the data's purpose or role in the algorithm"
- "This data type is inappropriate for the values it holds and should be changed to a more specific type"
- "The magic numbers here should be replaced with named constants to improve maintainability"
**Avoid vague criticism like:**
- "This data usage is bad"
- "I don't like this variable"
- "This should be done differently"
### Emphasize Data Integrity
- "The data validation here is insufficient and could lead to runtime errors"
- "This variable is not properly initialized before use, which could cause undefined behavior"
- "The error handling for this data operation is missing, which could lead to silent failures"
- "The data structure choice is inefficient for the operations being performed"
**Avoid vague criticism like:**
- "This data handling is confusing"
- "This is hard to understand"
- "This needs to be better"
### Consider Professional Standards
- "This violates the principle of minimal scope by declaring variables at too broad a level"
- "The data type choice doesn't follow the established patterns in this codebase"
- "The error handling here doesn't provide enough information for debugging"
- "This data structure has O(n²) complexity when an O(n log n) structure would be more appropriate"
**Avoid vague criticism like:**
- "This is inefficient"
- "This doesn't follow best practices"
## Data Management Problem Categories
### Scope and Lifetime Problems
- **Scope Issues**: Variables declared at inappropriate scope levels
- **Lifetime Problems**: Poor variable lifetime management
- **Initialization Issues**: Variables not properly initialized before use
- **Cleanup Problems**: Missing or improper variable cleanup
### Naming and Documentation Problems
- **Poor Names**: Variable names that don't clearly express purpose
- **Inconsistent Conventions**: Inconsistent naming conventions throughout code
- **Magic Numbers**: Unnamed constants that make code hard to understand
- **Missing Documentation**: Poor documentation of variable purposes
### Data Type Problems
- **Inappropriate Types**: Data types that don't match their values
- **Memory Issues**: Inefficient memory usage and storage
- **Type Conversion Problems**: Poor handling of type conversions
- **Constraint Issues**: Missing type constraints and validation
### Validation and Error Problems
- **Missing Validation**: Inadequate data validation at boundaries
- **Poor Error Handling**: Insufficient error checking for data operations
- **Unclear Messages**: Error messages that don't help debugging
- **Input Issues**: Poor input validation and sanitization
## Data Management Criticism Templates
### For Scope and Lifetime Issues
```
Scope/Lifetime Issue: [Specific scope or lifetime problem]
Problem: [How this violates good scope and lifetime principles]
Impact: [Reduced maintainability, reliability, or performance]
Evidence: [Specific variable examples and McConnell principles]
Priority: [Critical/High/Medium/Low]
```
### For Naming and Documentation Issues
```
Naming/Documentation Issue: [Specific naming or documentation problem]
Problem: [What makes this naming or documentation poor]
Impact: [Reduced readability, maintainability, or understandability]
Evidence: [Specific variable examples and naming patterns]
Priority: [High/Medium/Low]
```
### For Data Type Issues
```
Data Type Issue: [Specific data type problem]
Problem: [What makes this data type usage inappropriate]
Impact: [Performance, reliability, or maintainability issues]
Evidence: [Specific data type examples and usage patterns]
Priority: [High/Medium/Low]
```
## Data Management Criticism Best Practices
**Do:**
- **Reference McConnell's Principles**: Always connect criticism to specific principles from Code Complete
- **Focus on Data Quality**: Evaluate data management quality and professional standards
- **Consider Maintainability**: Think about long-term data management and evolution
- **Emphasize Data Integrity**: Prioritize data that's safe, reliable, and well-managed
- **Provide Actionable Feedback**: Give specific suggestions for improvement
**Don't:**
- **Ignore Context**: Don't criticize without understanding the data's role
- **Over-Optimize**: Don't suggest premature optimization at the expense of clarity
- **Assume Incompetence**: Don't assume the developer is incompetent or careless
- **Skip the Why**: Don't just say something is wrong without explaining why
- **Ignore Trade-offs**: Don't suggest improvements without considering trade-offs
## Data Management Criticism Checklist
### Scope and Lifetime Assessment
- [ ] Are variables declared at the smallest possible scope?
- [ ] Is variable lifetime managed appropriately?
- [ ] Are all variables properly initialized before use?
- [ ] Are global variables used only when necessary?
- [ ] Is variable cleanup handled properly?
### Naming and Documentation Assessment
- [ ] Do variable names clearly express their purpose?
- [ ] Are naming conventions consistent throughout the code?
- [ ] Have magic numbers been replaced with named constants?
- [ ] Is the purpose of each variable clear from its name?
- [ ] Are variable names self-documenting?
### Data Type Assessment
- [ ] Are data types appropriate for the values they hold?
- [ ] Is memory usage efficient and appropriate?
- [ ] Are type conversions handled properly?
- [ ] Are type constraints and validation in place?
- [ ] Are structured data types used effectively?
### Validation and Error Assessment
- [ ] Is data validated at appropriate boundaries?
- [ ] Are all data operations checked for errors?
- [ ] Are error messages clear and actionable?
- [ ] Is input validation comprehensive and robust?
- [ ] Are assertions used to catch data integrity issues?
## Data Management Evaluation Questions
### For Any Data Usage
1. **Is the data scope appropriate for its usage?**
2. **Is the data properly initialized and managed?**
3. **Are data types appropriate for the values they hold?**
4. **Is data validation comprehensive and robust?**
5. **Are error conditions handled properly?**
6. **Is the data easy to understand and maintain?**
7. **Are naming conventions clear and consistent?**
8. **Is memory usage efficient and appropriate?**
9. **Are data structures chosen appropriately?**
10. **Is data integrity protected throughout the code?**
### For Library Data
1. **Are public data interfaces well-designed and documented?**
2. **Is data validation consistent and clear?**
3. **Are data types appropriate and well-defined?**
4. **Is the data easy to use correctly?**
5. **Are all data resources properly managed?**
### For Application Data
1. **Is business data clearly separated from technical data?**
2. **Are user inputs properly validated and sanitized?**
3. **Is error handling user-friendly and informative?**
4. **Is the data organized for easy maintenance?**
5. **Are performance characteristics appropriate for the use case?**
## Code Complete Principles Applied
### "Use Data Structures Appropriately"
- Choose data structures that match the problem requirements
- Use appropriate data types for clarity and safety
- Avoid unnecessary complexity in data representation
- Consider the performance implications of data structure choices
### "Keep Variables Close to Their Usage"
- Declare variables at the smallest possible scope
- Minimize variable lifetime and visibility
- Use local variables instead of global variables when possible
- Keep related data close together
### "Name Variables for Clarity"
- Use clear, descriptive names for variables
- Choose names that express purpose and intent
- Follow consistent naming conventions
- Avoid names that could be misinterpreted
### "Validate Data at Boundaries"
- Check data validity at system boundaries
- Use assertions to catch data integrity issues
- Assume that external data may be invalid
- Protect against common data corruption modes
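The distinction this principle draws between checking external data and asserting internal invariants can be sketched in one hypothetical function (the balance/payment domain and names are invented):

```python
def apply_payment(balance_cents, payment_cents):
    # External data may be invalid: validate it explicitly, with an
    # error the caller can act on.
    if payment_cents <= 0:
        raise ValueError(f"payment must be positive, got {payment_cents}")
    new_balance = balance_cents - payment_cents
    # Internal invariant: an assertion documents a condition that should
    # be impossible if the surrounding code is correct, and catches the
    # bug during development if it ever isn't.
    assert new_balance <= balance_cents, "balance must never grow on payment"
    return new_balance
```

Exceptions guard the boundary against bad inputs; the assertion guards the code against its own future modifications.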
### "Handle Data Errors Gracefully"
- Check for data errors at appropriate boundaries
- Provide clear, actionable error messages
- Use consistent error handling patterns
- Fail gracefully when data errors cannot be recovered from
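A minimal sketch of one such consistent pattern, applied to a hypothetical config value (`load_port`, the key name, and the range are invented): detect the error where it occurs, report both what was expected and what was found, and fail before the bad value propagates.

```python
def load_port(config):
    """Read and validate a TCP port from a config mapping, failing with a
    specific, actionable message at the first sign of bad data."""
    raw = config.get("port")
    if raw is None:
        raise KeyError("config missing required key 'port'")
    try:
        port = int(raw)
    except ValueError:
        raise ValueError(f"'port' must be an integer, got {raw!r}") from None
    if not 1 <= port <= 65535:
        raise ValueError(f"'port' must be in 1..65535, got {port}")
    return port
```

Each failure mode gets its own message naming the offending key and value, so the error that surfaces in a log is enough to diagnose the problem without reproducing it.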
## Data Management Quality Metrics
### Variable Quality Metrics
- **Scope Minimization**: Variables should be declared at the smallest possible scope
- **Lifetime Management**: Variable lifetime should be managed appropriately
- **Initialization**: All variables should be properly initialized before use
- **Cleanup**: Variables should be properly cleaned up when no longer needed
### Data Type Quality Metrics
- **Type Appropriateness**: Data types should match their values
- **Memory Efficiency**: Memory usage should be efficient and appropriate
- **Type Safety**: Type conversions should be handled safely
- **Constraint Validation**: Type constraints should be validated
### Validation Quality Metrics
- **Boundary Validation**: Data should be validated at appropriate boundaries
- **Error Handling**: All data operations should be checked for errors
- **Error Messages**: Error messages should be clear and actionable
- **Input Validation**: Input validation should be comprehensive and robust
### Maintainability Metrics
- **Naming Clarity**: Variable names should clearly express purpose
- **Documentation**: Data purpose should be well-documented
- **Consistency**: Data handling should follow consistent patterns
- **Integrity**: Data integrity should be protected throughout the code
## Professional Standards for Data Management
### Data Design Standards
- **Minimal Scope**: Variables should be declared at the smallest possible scope
- **Clear Naming**: Variable names should clearly describe their purpose
- **Appropriate Types**: Data types should match their values
- **Consistent Patterns**: Data handling should follow established patterns
### Data Validation Standards
- **Boundary Validation**: Data should be validated at appropriate boundaries
- **Error Handling**: Error handling should be consistent and clear
- **Input Validation**: Input validation should be comprehensive and robust
- **Documentation**: Data purpose should be well-documented
### Quality Assurance Standards
- **Data Validation**: Comprehensive data checking and validation
- **Error Handling**: Proper error reporting and handling
- **Resource Management**: Proper acquisition and release of data resources
- **Security Considerations**: Consider the security implications of data handling