Clean up repo analysis results and refactor #6963
base: main
Conversation
Force-pushed from 673af9e to 65e877e.
Force-pushed from 215695e to 50bc9bb.
Pull Request Overview
This PR refactors organization analytics tools by extracting cache management into a reusable module and reorganizing scripts under a dedicated subdirectory structure.
- Extracted cache management functionality from the main script into a separate `cache_manager.py` module
- Added a proper requirements.txt and documentation for the analytics toolset
- Enhanced runner usage analysis with improved filtering and alphabetical sorting of output
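As a rough illustration of the extracted module's shape, here is a minimal sketch. The function name mirrors the PR's `make_cached_request`, but the hashed-filename cache scheme and the `requests` usage are assumptions for illustration, not the PR's actual implementation:

```python
import hashlib
import json
from pathlib import Path
from typing import Dict, Optional

import requests

CACHE_DIR = Path(".cache")


def make_cached_request(url: str, headers: Dict[str, str]) -> Optional[Dict]:
    """Fetch a URL, serving repeated requests from an on-disk JSON cache."""
    CACHE_DIR.mkdir(exist_ok=True)
    # Hash the URL so the cache filename is filesystem-safe.
    cache_file = CACHE_DIR / (hashlib.sha256(url.encode()).hexdigest() + ".json")
    if cache_file.exists():
        return json.loads(cache_file.read_text())
    response = requests.get(url, headers=headers)
    if response.status_code != 200:
        return None
    data = response.json()
    cache_file.write_text(json.dumps(data))
    return data
```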
Reviewed Changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 3 comments.
File | Description
---|---
tools/analytics/org/requirements.txt | Defines Python dependencies for the analytics tools
tools/analytics/org/cache_manager.py | New module providing caching functionality for GitHub API responses
tools/analytics/org/analyze_runner_usage.py | Refactored to use extracted cache manager and added output sorting
tools/analytics/org/README.md | Documentation for the analytics tools and their usage
tools/analytics/org/.gitignore | Ignores cache directory and temporary files
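For context, the refactored `analyze_runner_usage.py` would consume the module along these lines. This is a hypothetical sketch: `list_runner_labels` is an illustrative name, though the org-runners endpoint it queries is a real GitHub API route. It shows the import plus the alphabetical output sorting the PR adds:

```python
from typing import List

from cache_manager import make_cached_request


def list_runner_labels(org: str, token: str) -> List[str]:
    """Collect distinct runner labels for an org, sorted alphabetically."""
    data = make_cached_request(
        f"https://api.github.com/orgs/{org}/actions/runners",
        headers={"Authorization": f"Bearer {token}"},
    )
    labels = set()
    for runner in (data or {}).get("runners", []):
        labels.update(label["name"] for label in runner.get("labels", []))
    return sorted(labels)  # the PR adds alphabetical sorting of output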
```python
CACHE_DIR.mkdir(exist_ok=True)
```
This line creates the global CACHE_DIR but should create self.cache_dir instead. The instance variable cache_dir is set on lines 20-21, making this line redundant and potentially confusing.
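In other words, assuming the module wraps its state in a class with a `cache_dir` attribute (per the comment's reference to `self.cache_dir`; the class name `CacheManager` is an assumption), the fix would look roughly like:

```python
from pathlib import Path


class CacheManager:
    def __init__(self, cache_dir: Path = Path(".cache")) -> None:
        self.cache_dir = cache_dir
        # Create this instance's own directory; the flagged line instead
        # recreated the module-level CACHE_DIR, which is redundant here.
        self.cache_dir.mkdir(exist_ok=True)
```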
```python
def make_cached_request(url: str, headers: Dict[str, str]) -> Optional[Dict]:
    """
    Make an HTTP request with caching. Returns the JSON response if successful.

    Args:
        url: The URL to request
        headers: Headers for the request (required)
```
The headers parameter should be Optional[Dict[str, str]] to match the usage pattern and handle cases where headers might not be provided, even though the current implementation requires them.
Suggested change:

```diff
-def make_cached_request(url: str, headers: Dict[str, str]) -> Optional[Dict]:
+def make_cached_request(url: str, headers: Optional[Dict[str, str]] = None) -> Optional[Dict]:
     """
     Make an HTTP request with caching. Returns the JSON response if successful.

     Args:
         url: The URL to request
-        headers: Headers for the request (required)
+        headers: Optional headers for the request (default: None)
```
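If that suggestion is adopted, the function body would also need to tolerate `headers=None` before the request is made. A minimal sketch of that guard, with caching elided for brevity and the `requests.get` call an assumption about the implementation:

```python
from typing import Dict, Optional

import requests


def make_cached_request(url: str, headers: Optional[Dict[str, str]] = None) -> Optional[Dict]:
    headers = headers or {}  # tolerate the new None default
    response = requests.get(url, headers=headers)
    return response.json() if response.ok else None
```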
```diff
 # Add more runner labels to exclude here as needed
 ]

 USELESS_RUNNER_LABELS = [
-    "self-hosted",  # really, a useless label we want to ignoreß
+    "self-hosted",  # really, a useless label we want to ignore
```
The comment contains a typo: 'ignoreß' should be 'ignore'. The character 'ß' appears to be a copy-paste error.
Updating the script to include more repo analysis and organizing it under a subdir, since there will be more scripts coming.
These are tools used to do a one-off investigation on org repos, but checking them in, in case they turn out to be more widely useful.