Hands-On Project
📚 Chapter Overview
In this chapter we build a complete hands-on project that brings together the Rust concepts covered so far. We will implement a command-line tool that touches on file operations, error handling, and parallel programming.
🎯 Project Goals
- Build a fully functional CLI tool
- Apply Rust's core features in combination
- Learn project organization and modularization
- Practice writing tests and documentation
- Understand the build and release workflow
🛠️ Project Overview
1.1 Introduction
We will build a file search and analysis tool with the following features:
- Recursive file search
- Filtering by file type
- File statistics and summaries
- Parallel processing for better performance
- Colored output and progress display
1.2 Technology Stack
- Tokio: async runtime
- Clap: command-line argument parsing
- Colored: colored terminal output
- Indicatif: progress bars
- Rayon: parallel iterators
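If you prefer adding dependencies from the command line, `cargo add` (built into Cargo since 1.62) can set them up; versions will resolve to the latest compatible releases rather than the exact ones pinned below:

```shell
cargo add tokio --features full
cargo add clap --features derive
cargo add colored indicatif rayon walkdir
cargo add tempfile --dev
```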
📦 Project Structure
Text Only
file_analyzer/
├── Cargo.toml
├── src/
│   ├── main.rs
│   ├── lib.rs
│   ├── cli.rs
│   ├── search.rs
│   ├── analyzer.rs
│   └── output.rs
├── tests/
│   └── integration_test.rs
└── README.md
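The integration tests shown later import items via `use file_analyzer::…`, which requires a library target alongside the binary. A minimal `src/lib.rs` that re-exports the modules could look like this:

```rust
// src/lib.rs — expose the modules as a library so integration
// tests (and other crates) can call into them.
pub mod analyzer;
pub mod cli;
pub mod output;
pub mod search;
```

With this in place, `use file_analyzer::search::…` works from `tests/`; `main.rs` can keep its own `mod` declarations, though larger projects usually have the binary import from the library instead.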
🚀 Implementation
2.1 Cargo.toml
TOML
[package]
name = "file_analyzer"
version = "0.1.0"
edition = "2021"
authors = ["Your Name <you@example.com>"]

[dependencies]
tokio = { version = "1.0", features = ["full"] }
clap = { version = "4.0", features = ["derive"] }
colored = "2.0"
indicatif = "0.17"
rayon = "1.7"
walkdir = "2.3"

[dev-dependencies]
tempfile = "3.0"
2.2 CLI Module
src/cli.rs
Rust
use clap::Parser;
use colored::Colorize;

#[derive(Parser, Debug)]
#[command(name = "file_analyzer")]
#[command(about = "A powerful file analysis tool", long_about = None)]
pub struct Cli {
    /// Search path
    #[arg(short, long, default_value = ".")]
    pub path: String,

    /// File extension to filter by
    #[arg(short, long)]
    pub extension: Option<String>,

    /// Minimum file size in bytes
    // long-only: a short `-m` would clash with --max-size
    #[arg(long, default_value = "0")]
    pub min_size: u64,

    /// Maximum file size in bytes
    #[arg(long)]
    pub max_size: Option<u64>,

    /// Enable parallel processing
    // long-only: a short `-p` would clash with --path
    #[arg(long)]
    pub parallel: bool,

    /// Show detailed information
    #[arg(short, long)]
    pub verbose: bool,
}

impl Cli {
    pub fn print_info(&self) {
        println!("{}", "File Analyzer".bold().cyan());
        println!("Path: {}", self.path);
        if let Some(ext) = &self.extension {
            println!("Extension: {}", ext);
        }
        println!("Min size: {} bytes", self.min_size);
        if let Some(max) = self.max_size {
            println!("Max size: {} bytes", max);
        }
        println!("Parallel: {}", self.parallel);
        println!("Verbose: {}", self.verbose);
    }
}
2.3 Search Module
src/search.rs
Rust
use std::path::{Path, PathBuf};
use walkdir::WalkDir;
use colored::Colorize;

#[derive(Clone)]
pub struct FileEntry {
    pub path: PathBuf,
    pub size: u64,
    pub extension: Option<String>,
}

pub struct SearchOptions {
    pub extension: Option<String>,
    pub min_size: u64,
    pub max_size: Option<u64>,
}

impl FileEntry {
    pub fn new(path: PathBuf) -> Self {
        // Missing metadata (e.g. a broken symlink) is treated as size 0.
        let size = path.metadata().map(|m| m.len()).unwrap_or(0);
        let extension = path
            .extension()
            .and_then(|e| e.to_str())
            .map(|s| s.to_lowercase());
        FileEntry { path, size, extension }
    }

    pub fn matches(&self, options: &SearchOptions) -> bool {
        // Check size bounds
        if self.size < options.min_size {
            return false;
        }
        if let Some(max) = options.max_size {
            if self.size > max {
                return false;
            }
        }
        // Check extension (only if a filter was given)
        match &options.extension {
            Some(ext) => self.extension.as_deref() == Some(ext.as_str()),
            None => true,
        }
    }

    pub fn display(&self, verbose: bool) {
        let size_str = format_size(self.size);
        if verbose {
            println!(
                "{} {} {}",
                self.path.display().to_string().green(),
                size_str.yellow(),
                self.extension.as_deref().unwrap_or("none").cyan()
            );
        } else {
            println!(
                "{} {}",
                self.path.display().to_string().green(),
                size_str.yellow()
            );
        }
    }
}

fn format_size(size: u64) -> String {
    const KB: u64 = 1024;
    const MB: u64 = KB * 1024;
    const GB: u64 = MB * 1024;
    if size >= GB {
        format!("{:.2} GB", size as f64 / GB as f64)
    } else if size >= MB {
        format!("{:.2} MB", size as f64 / MB as f64)
    } else if size >= KB {
        format!("{:.2} KB", size as f64 / KB as f64)
    } else {
        format!("{} B", size)
    }
}

pub fn search_files(path: &Path, options: &SearchOptions) -> Vec<FileEntry> {
    let mut files = Vec::new();
    for entry in WalkDir::new(path)
        .into_iter()
        .filter_map(|e| e.ok())
        .filter(|e| e.file_type().is_file())
    {
        let file = FileEntry::new(entry.path().to_path_buf());
        if file.matches(options) {
            files.push(file);
        }
    }
    files
}
2.4 Analyzer Module
src/analyzer.rs
Rust
use std::collections::HashMap;
use std::path::Path;
use std::sync::Mutex;
use colored::Colorize;
use crate::search::{FileEntry, SearchOptions, search_files};

pub struct AnalysisResult {
    pub total_files: usize,
    pub total_size: u64,
    pub extension_counts: HashMap<String, usize>,
    pub largest_files: Vec<FileEntry>,
}

impl AnalysisResult {
    pub fn new() -> Self {
        AnalysisResult {
            total_files: 0,
            total_size: 0,
            extension_counts: HashMap::new(),
            largest_files: Vec::new(),
        }
    }

    pub fn add_file(&mut self, file: &FileEntry) {
        self.total_files += 1;
        self.total_size += file.size;
        if let Some(ref ext) = file.extension {
            *self.extension_counts.entry(ext.clone()).or_insert(0) += 1;
        }
        // Keep only the ten largest files seen so far.
        self.largest_files.push(file.clone());
        self.largest_files.sort_by(|a, b| b.size.cmp(&a.size));
        self.largest_files.truncate(10);
    }

    pub fn display(&self) {
        println!("\n{}", "Analysis Summary".bold().cyan());
        println!("Total files: {}", self.total_files);
        println!("Total size: {}", format_size(self.total_size));

        println!("\n{}", "Extension Distribution".bold().cyan());
        let mut extensions: Vec<_> = self.extension_counts.iter().collect();
        extensions.sort_by(|a, b| b.1.cmp(a.1));
        for (ext, count) in extensions.iter().take(10) {
            println!("  {}: {} files", ext, count);
        }

        println!("\n{}", "Largest Files".bold().cyan());
        for file in &self.largest_files {
            file.display(true);
        }
    }
}

// Duplicated from src/search.rs for simplicity; in a larger project this
// would live in a shared module.
fn format_size(size: u64) -> String {
    const KB: u64 = 1024;
    const MB: u64 = KB * 1024;
    const GB: u64 = MB * 1024;
    if size >= GB {
        format!("{:.2} GB", size as f64 / GB as f64)
    } else if size >= MB {
        format!("{:.2} MB", size as f64 / MB as f64)
    } else if size >= KB {
        format!("{:.2} KB", size as f64 / KB as f64)
    } else {
        format!("{} B", size)
    }
}

pub fn analyze_directory(path: &Path, options: &SearchOptions) -> AnalysisResult {
    let files = search_files(path, options);
    let mut result = AnalysisResult::new();
    for file in &files {
        result.add_file(file);
    }
    result
}

pub fn analyze_parallel(path: &Path, options: &SearchOptions) -> AnalysisResult {
    use rayon::prelude::*;

    let files = search_files(path, options);
    // Workers share one AnalysisResult behind a Mutex; add_file is cheap,
    // so lock contention is acceptable for a tool of this size.
    let result = Mutex::new(AnalysisResult::new());
    files.par_iter().for_each(|file| {
        result.lock().unwrap().add_file(file);
    });
    result.into_inner().unwrap()
}
2.5 Output Module
src/output.rs
Rust
use colored::Colorize;
use indicatif::{ProgressBar, ProgressStyle};
use crate::analyzer::AnalysisResult;
use crate::search::FileEntry;

pub struct OutputOptions {
    pub verbose: bool,
    pub show_progress: bool,
}

impl OutputOptions {
    pub fn new(verbose: bool, show_progress: bool) -> Self {
        OutputOptions { verbose, show_progress }
    }
}

pub fn print_files(files: &[FileEntry], options: &OutputOptions) {
    let progress = if options.show_progress {
        let pb = ProgressBar::new(files.len() as u64);
        pb.set_style(
            ProgressStyle::default_bar()
                // `{msg}` gives finish_with_message somewhere to render.
                .template("{spinner:.green} [{elapsed_precise}] [{bar:40.cyan/blue}] {pos}/{len} ({eta}) {msg}")
                .unwrap()
                .progress_chars("#>-"),
        );
        Some(pb)
    } else {
        None
    };

    for file in files {
        if let Some(ref pb) = progress {
            pb.inc(1);
        }
        file.display(options.verbose);
    }

    if let Some(pb) = progress {
        pb.finish_with_message("Done!");
    }
}

pub fn print_analysis(result: &AnalysisResult) {
    result.display();
}

pub fn print_error(message: &str) {
    eprintln!("{} {}", "Error:".red().bold(), message);
}

pub fn print_warning(message: &str) {
    eprintln!("{} {}", "Warning:".yellow().bold(), message);
}

pub fn print_success(message: &str) {
    println!("{} {}", "Success:".green().bold(), message);
}
2.6 Main Program
src/main.rs
Rust
mod cli;
mod search;
mod analyzer;
mod output;

use clap::Parser;
use colored::Colorize;
use cli::Cli;
use search::SearchOptions;
use analyzer::{analyze_directory, analyze_parallel};
use output::{print_analysis, print_error, print_success};

#[tokio::main]
async fn main() {
    let cli = Cli::parse();

    if cli.verbose {
        cli.print_info();
    }

    let path = std::path::Path::new(&cli.path);
    if !path.exists() {
        print_error(&format!("Path '{}' does not exist", cli.path));
        std::process::exit(1);
    }

    let options = SearchOptions {
        extension: cli.extension.clone(),
        min_size: cli.min_size,
        max_size: cli.max_size,
    };

    println!("\n{} {}", "Searching in:".cyan(), cli.path);

    let result = if cli.parallel {
        analyze_parallel(path, &options)
    } else {
        analyze_directory(path, &options)
    };

    print_analysis(&result);
    print_success(&format!("Found {} files", result.total_files));
}
🧪 Testing
3.1 Unit Tests
src/search.rs
Rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_file_entry_creation() {
        let path = PathBuf::from("test.txt");
        let entry = FileEntry::new(path);
        assert_eq!(entry.extension, Some("txt".to_string()));
    }

    #[test]
    fn test_size_filtering() {
        let options = SearchOptions {
            extension: None,
            min_size: 100,
            max_size: None,
        };
        let small_file = FileEntry {
            path: PathBuf::from("small.txt"),
            size: 50,
            extension: Some("txt".to_string()),
        };
        assert!(!small_file.matches(&options));
    }

    #[test]
    fn test_extension_filtering() {
        let options = SearchOptions {
            extension: Some("rs".to_string()),
            min_size: 0,
            max_size: None,
        };
        let rust_file = FileEntry {
            path: PathBuf::from("test.rs"),
            size: 100,
            extension: Some("rs".to_string()),
        };
        let txt_file = FileEntry {
            path: PathBuf::from("test.txt"),
            size: 100,
            extension: Some("txt".to_string()),
        };
        assert!(rust_file.matches(&options));
        assert!(!txt_file.matches(&options));
    }
}
3.2 Integration Tests
Integration tests live in the tests/ directory and exercise the crate through its public library interface (use file_analyzer::...), so the modules must also be exposed from a library target (src/lib.rs).
tests/integration_test.rs
Rust
use std::fs;
use tempfile::TempDir;
use file_analyzer::search::{SearchOptions, search_files};

#[test]
fn test_search_files() {
    let temp_dir = TempDir::new().unwrap();
    let test_dir = temp_dir.path();

    // Create test files
    fs::write(test_dir.join("file1.txt"), "content1").unwrap();
    fs::write(test_dir.join("file2.rs"), "content2").unwrap();
    fs::write(test_dir.join("file3.txt"), "content3").unwrap();

    let options = SearchOptions {
        extension: Some("txt".to_string()),
        min_size: 0,
        max_size: None,
    };

    let files = search_files(test_dir, &options);
    assert_eq!(files.len(), 2);
}
📚 Documentation
4.1 README.md
Markdown
# File Analyzer
A powerful command-line tool for analyzing and searching files.
## Features
- Recursive file search
- Filter by file extension
- Filter by file size
- Parallel processing
- Detailed analysis
- Colored output
## Installation
```bash
cargo install --path .
```
## Usage
```bash
# Search all files
file_analyzer
# Search for specific extension
file_analyzer --extension rs
# Search with size filter
file_analyzer --min-size 1000 --max-size 100000
# Enable parallel processing
file_analyzer --parallel
# Verbose output
file_analyzer --verbose
```
## Examples
```bash
# Find all Rust files
file_analyzer --extension rs --parallel
# Find large files
file_analyzer --min-size 1048576
# Detailed analysis
file_analyzer --verbose --parallel
```
## License
MIT
🚀 Build and Release
5.1 Building
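The usual Cargo commands apply; run them from the project root:

```shell
# Debug build
cargo build

# Optimized release build (binary lands in target/release/)
cargo build --release

# Run directly during development
cargo run -- --extension rs --verbose
```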
5.2 Testing
Bash
# Run all tests
cargo test
# Run tests with output
cargo test -- --nocapture
# Run integration tests
cargo test --test integration_test
5.3 Publishing to crates.io
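Publishing follows the standard crates.io workflow; this is a sketch, and it assumes you have a crates.io account and a crate name that is not already taken:

```shell
# Authenticate once with your crates.io API token
cargo login

# Dry run: verify packaging without uploading
cargo publish --dry-run

# Publish the crate
cargo publish
```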
📝 Exercises
Exercise 1: Add a New Feature
Exercise 2: Performance Optimization
Exercise 3: Enhance the Output
💡 Best Practices
1. Error Handling
Rust
// Good: return a Result so callers can decide how to handle failure
pub fn search_files(path: &Path) -> Result<Vec<FileEntry>, std::io::Error> {
    // ...
}

// Avoid: panicking inside library code
pub fn search_files(path: &Path) -> Vec<FileEntry> {
    // ...
    panic!("Error!");
}
2. Documentation Comments
Rust
/// Search for files matching the given criteria.
///
/// # Arguments
///
/// * `path` - The directory to search
/// * `options` - Search options including filters
///
/// # Returns
///
/// A vector of matching file entries
///
/// # Examples
///
/// ```
/// let files = search_files(Path::new("."), &options);
/// ```
pub fn search_files(path: &Path, options: &SearchOptions) -> Vec<FileEntry> {
    // ...
}
3. Test Coverage
Rust
// Good: cover normal, edge, and error cases
#[test]
fn test_normal_case() { /* ... */ }

#[test]
fn test_edge_case() { /* ... */ }

#[test]
fn test_error_case() { /* ... */ }

// Avoid: shipping code with no tests at all
🎯 Project Recap
Through this hands-on project we practiced:
- ✅ Project organization and modularization
- ✅ Command-line argument parsing
- ✅ File system operations
- ✅ Parallel processing
- ✅ Error handling
- ✅ Writing tests
- ✅ Writing documentation
- ✅ Building and publishing
📚 Further Reading
🎯 Chapter Summary
In this chapter we brought together the Rust knowledge from earlier chapters in one complete project:
- ✅ Built a fully functional CLI tool
- ✅ Applied Rust's core features in combination
- ✅ Learned project organization and modularization
- ✅ Practiced writing tests and documentation
- ✅ Learned the build and release workflow
Congratulations! You have completed this Rust tutorial and are ready to build Rust applications on your own! 🦀