Software Development Best Practices
Security
Software security is the practice of engineering software so that it continues to function correctly even under malicious attack.
Addressing security early helps avoid vulnerabilities. DevSecOps is a good example of this. However, security adds cost and time to development.
Terminology:
- Defects: Deviation from requirements.
- Bugs: Implementation-level errors that can be detected and removed.
- Flaws: Deeper, design-level problems that may be instantiated in code or reflect something missing from the design (e.g., poor error handling that exposes sensitive data).
- Failures: The observable inability of the software to perform a required function.
- Vulnerabilities: Errors or weaknesses that an attacker can exploit.
Risk Management: Risk combines the likelihood that a flaw or bug will be exploited with the impact on the software's purpose: Risk = probability * impact.
Risk Management Framework (RMF): An overall approach to identifying, ranking, tracking, and understanding security risks so they can be handled effectively.
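A minimal sketch in Python of scoring risks with Risk = probability * impact and ranking them for an RMF; the risk names and numbers are hypothetical.

# Hypothetical risk register: estimated probability (0-1) and impact (expected cost).
risks = [
    {"name": "SQL injection in login form", "probability": 0.3, "impact": 50_000},
    {"name": "Unencrypted backups", "probability": 0.1, "impact": 200_000},
    {"name": "Weak password policy", "probability": 0.5, "impact": 10_000},
]

# Risk = probability * impact
for r in risks:
    r["score"] = r["probability"] * r["impact"]

# Rank risks so the highest-scoring ones are tracked and handled first.
for r in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(f'{r["name"]}: {r["score"]:,.0f}')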
STRIDE Model for Threats:
- Spoofing identity
- Tampering with data
- Repudiation (denying an action, e.g., lying about a purchase)
- Information disclosure (exposure of confidential data)
- Denial of service (overwhelming a server with requests)
- Elevation of privilege
PASTA (Process for Attack Simulation and Threat Analysis): A seven-step threat-modeling methodology.
Prioritize and rank risks by their likelihood and impact, and track the number of risks over time.
Mitigation:
- Mitigate Untrusted Data: Validate and sanitize all inputs to prevent SQL injection, and escape or reject malicious scripts to prevent XSS (see the sketch after this list).
- Mitigate Bad Authentication: Use server certificates.
- Mitigate Password Cracking: Account for brute-forcing and rainbow tables. Use salting and peppering.
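A minimal sketch in Python of two of these mitigations: a parameterized SQL query, so untrusted input is never spliced into the SQL string, and salted password hashing with PBKDF2 to defeat rainbow tables. Output escaping for XSS is not shown; the table, column names, and iteration count are illustrative assumptions.

import hashlib
import secrets
import sqlite3

# --- Mitigate untrusted data: parameterized query instead of string concatenation ---
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
user_input = "alice'; DROP TABLE users; --"  # malicious input is treated as plain data
rows = conn.execute("SELECT email FROM users WHERE name = ?", (user_input,)).fetchall()

# --- Mitigate password cracking: salt each password before hashing ---
def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = secrets.token_bytes(16)                     # unique salt defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return secrets.compare_digest(candidate, digest)   # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)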
Security Practices:
- Architectural risk analysis
- Code reviews
- Penetration testing (controlled and authorized simulation of a cyberattack – most common)
- Security operations (best practices such as backups and logging)
UX/UI Design
Requirements define the system for the client, while design defines the system for the developer. UX draws on multiple disciplines and forms the overall feel of the experience; UI focuses on how the interface looks and functions. Usability, a quality attribute of the UI, determines how easy the system is to learn and use. UI development can be a major component of a project, and usability work can be expensive in terms of hardware and effort.
Design Process:
- Define the problem
- Collect information
- Brainstorm ideas
- Develop solutions
- Present ideas to gain feedback
- Improve design
Prototypes: Preliminary versions used to get user feedback, create variations, and encourage communication (often using wireframes).
UI Principles: UI design is an art, but some general principles apply:
- Consistency (between different parts of the app, e.g., consistent visual weight and typography)
- Feedback
- Control of actions (the ability to interrupt or reverse actions)
- Error handling (clear error messages)
Constraints: Desktops, laptops, smartphones, and touchscreens have different constraints. Account for device performance (e.g., graphics), display sizes, keyboard/mouse availability, and connectivity.
Unpredictability: Data transfer over a network has unpredictable response times and requires visual feedback and cancellation ability.
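A minimal sketch in Python of guarding against unpredictable response times with a timeout and a user-facing fallback instead of an indefinite hang; the URL and timeout value are placeholders, and a real UI would also offer a cancel option.

import urllib.error
import urllib.request

def fetch_with_feedback(url: str, timeout_seconds: float = 5.0) -> str:
    """Fetch a resource, but give the user feedback instead of hanging forever."""
    try:
        with urllib.request.urlopen(url, timeout=timeout_seconds) as response:
            return response.read().decode("utf-8")
    except (urllib.error.URLError, TimeoutError):
        # In a real UI this would trigger a "retry or cancel" dialog.
        return "The network is slow or unavailable. Please retry or cancel."

print(fetch_with_feedback("https://example.com/"))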
Accessibility: Consider eyesight, hearing, and dexterity (bigger, more spaced-out buttons). Include testers of various ages.
Responsiveness: Less content can be displayed on smaller screens, and smartphones are usually used in portrait orientation. Touch input behaves differently from a mouse, and virtual keyboards take up screen space. A single site should adapt to any device using flexible grids, fluid layouts, and CSS media queries.
Page Organization: Keep It Simple, Stupid (KISS). Hyperlinks are the basic building blocks of the web, and sites typically use a hierarchical tree structure. State is lost when a page is left; Android/iOS instead encourage a stack-based structure for managing screen state.
Navigation: Use simple mock-ups and consider various scenarios. Make it easy to navigate. Menus can be scrolling, hierarchical, or control panels and should be consistent, broad, and shallow.
Content: Text should be precise, unambiguous, and aligned. Graphics should be easy to comprehend and recognize, use color well, and vary for different cases.
Understand and adopt the target platform's style conventions: users will already be familiar with interfaces from other apps, and designs may need modification when new OS versions are released.
Design Principles
While system architecture defines needed components, existing components, and their connections/protocols, design focuses on how components are developed—choices like data representation, interfaces, and class hierarchies.
Key Principles:
- KISS (Keep It Simple, Stupid): Avoid unnecessary complexity.
- YAGNI (You Ain’t Gonna Need It): Don’t develop features that aren’t currently required.
- DRY (Don’t Repeat Yourself): Every piece of knowledge should have a single, unambiguous representation.
SOLID Principles:
- Single Responsibility: Each class has one responsibility it does well.
- Open/Closed Principle: Open for extension, closed for modification. Add new functionality without modifying existing code (abstraction and polymorphism).
- Liskov Substitution Principle: Objects of a superclass should be replaceable with objects of a subclass.
- Interface Segregation Principle: No client should be forced to implement unused methods.
- Dependency Inversion/Injection: Depend on abstractions rather than concrete implementations; inject dependencies (through constructors or setters) instead of hardcoding them.
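A minimal sketch in Python of dependency inversion/injection: the high-level ReportService depends only on an abstract Storage interface and receives a concrete implementation through its constructor. All class names here are hypothetical.

from abc import ABC, abstractmethod

class Storage(ABC):
    """Abstraction the high-level code depends on (dependency inversion)."""
    @abstractmethod
    def save(self, name: str, data: str) -> None: ...

class FileStorage(Storage):
    def save(self, name: str, data: str) -> None:
        with open(name, "w") as f:
            f.write(data)

class InMemoryStorage(Storage):
    """Swappable implementation, e.g., for tests (Liskov substitution)."""
    def __init__(self) -> None:
        self.items: dict[str, str] = {}
    def save(self, name: str, data: str) -> None:
        self.items[name] = data

class ReportService:
    def __init__(self, storage: Storage) -> None:
        self.storage = storage            # injected, not hardcoded

    def publish(self, title: str, body: str) -> None:
        self.storage.save(title, body)

# Production wiring vs. test wiring: only the injected dependency changes.
ReportService(FileStorage()).publish("report.txt", "quarterly numbers")
ReportService(InMemoryStorage()).publish("report.txt", "quarterly numbers")

Because ReportService only knows about the Storage abstraction, new storage backends can be added without modifying it, which also illustrates the open/closed principle.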
DevOps
The core ideas are to automate everything and to foster collaboration between development and operations.
Key Practices:
- Continuous Integration (CI): Merge code frequently, automate builds and tests. Catch errors early and ensure a consistently deployable application.
- Continuous Delivery (CD): Automate releases for quick and reliable deployments.
- Infrastructure as Code (IaC): Manage infrastructure with code for consistent setups, reduced manual errors, and efficient scaling.
- Microservices: Design applications as small, independent services for flexibility and scalability (but increased complexity).
- Monitoring and Logging: Gain real-time performance insights to identify issues, improve UX, and aid debugging.
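A minimal sketch in Python of the monitoring-and-logging idea: structured log lines plus a timing decorator that records how long each operation takes and whether it failed. The service name and operation are hypothetical; a real system would ship these logs to a monitoring stack.

import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("orders-service")

def monitored(func):
    """Log duration and failures of each call so issues show up quickly."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        except Exception:
            log.exception("operation=%s status=error", func.__name__)
            raise
        finally:
            duration_ms = (time.perf_counter() - start) * 1000
            log.info("operation=%s duration_ms=%.1f", func.__name__, duration_ms)
    return wrapper

@monitored
def place_order(order_id: str) -> str:
    time.sleep(0.05)          # stand-in for real work
    return f"order {order_id} placed"

place_order("A-123")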
Testing
Remember, you’re testing the application, not the user. Design and evaluation should be done by different people. Understand how well users can learn and use the product. Evaluate during development, before launch, and after launch.
Usability Evaluation: Focuses on the quality of the user experience.
- Intuitiveness
- Learnability
- Effectiveness (accuracy and completeness of goal achievement)
- Efficiency (relation between effectiveness and resources used)
- Memorability
- Error frequency and severity
- Satisfaction (user comfort)
Evaluation Process: Gather data from users and HCI experts. Vary demographics (even five users can be helpful). Focus on typical tasks and usability goals. Collect qualitative (happiness, frustration) and quantitative (# of clicks, time to complete tasks) data. Conduct tests in controlled (lab) and natural environments. Use two evaluators per test. Give users a script or tasks, observe them, note errors and performance issues, and conduct post-test interviews.
For post-release testing, log events (clicks, navigation, keystrokes, help system use, errors). Gather human feedback (complaints, praise, bug reports, requests). Conduct A/B testing.
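A minimal sketch in Python of analyzing an A/B test from logged conversion counts with a two-proportion z-test; the numbers are made up, and a real study would also plan sample sizes and significance thresholds in advance.

import math

def two_proportion_z(conversions_a: int, n_a: int, conversions_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    standard_error = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / standard_error

# Hypothetical logged data: variant A vs. variant B of a checkout page.
z = two_proportion_z(conversions_a=120, n_a=1000, conversions_b=150, n_b=1000)
print(f"z = {z:.2f}")   # |z| > 1.96 suggests a significant difference at the 5% level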
Code Review
Quality Assurance: Maintaining desired product properties through process choices.
- Testing: Running the program and inspecting results. Includes regression testing (ensuring new features don’t break existing ones), unit testing (testing smallest code components), and integration testing (testing component interaction).
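A minimal sketch in Python's unittest of a unit test; keeping it in the suite and re-running it after every change also makes it a regression test. The slugify function is a hypothetical unit under test.

import unittest

def slugify(title: str) -> str:
    """Unit under test: turn a title into a URL-friendly slug."""
    return "-".join(title.lower().split())

class SlugifyTests(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Code Review Basics"), "code-review-basics")

    def test_already_clean_title_unchanged(self):
        self.assertEqual(slugify("devops"), "devops")

if __name__ == "__main__":
    unittest.main()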
Challenges of Testing: High cost, difficulty measuring software quality, incomplete/changing specifications, and the need for cross-product integration testing.
Verification:
- Static: Verification without executing the program (manual code reviews/inspections, or automated static-analysis tools).
- Dynamic: Testing with trial data and debugging.
Pull Requests: Notify others of repository changes. Discuss and review changes, add comments, and even add commits. Be descriptive but concise, provide context and rationale, and include links/screenshots.
Code Review Rates: Defect-removal rates for design and code inspections are higher than for unit/integration testing; reviews significantly reduce errors.
Review Types: Code inspection, passing code around, pair programming.
Code Review Goals: Find bugs without execution. Ensure code is maintainable, readable, well-formatted, documented, necessary, and follows conventions.
Code Inspection: A formal, structured meeting. Reviewers first study the work separately with the supporting documentation, then meet to identify issues. Review the code line by line (150-250 lines in a 2-hour meeting) using a checklist (conditionals, loops, variable initialization, naming, memory allocation).
Code Review Timing: Start early.