Photon Flux Calculator: Project Setup & Infrastructure

by Pedro Alvarez

Hey guys! Let's dive into setting up a solid project structure and infrastructure for our photon flux calculator. A well-organized project is key to maintainability, scalability, and overall sanity. So, let’s get our hands dirty and build a robust foundation!

Objective

Our main goal here is to create a Python project that’s not only functional but also a pleasure to work with. We want clear module boundaries, comprehensive testing, and an automated workflow. Think of it as building a house – you need a strong foundation before you start adding the walls and roof.

Requirements

Here’s a checklist of what we need to get done:

  • [x] Create basic project structure
  • [x] Set up pyproject.toml for modern Python packaging
  • [x] Configure pytest for testing
  • [x] Set up GitHub Actions for CI/CD
  • [x] Create requirements.txt and requirements-dev.txt
  • [x] Add pre-commit hooks for code quality
  • [x] Create comprehensive README.md

Let’s break these down and see how we can nail each one.

Project Structure

First up, the project structure. This is how we're going to organize our files and directories. A clean structure makes it easier for us and anyone else to understand where things live. Here’s the layout we’re aiming for:

photon-flux-calculator/
├── src/
│   └── photon_flux/
│       ├── __init__.py
│       ├── astronomy/
│       │   ├── __init__.py
│       │   ├── sun.py
│       │   ├── moon.py
│       │   └── stars.py
│       ├── sensors/
│       │   ├── __init__.py
│       │   ├── base.py
│       │   ├── ar0235.py
│       │   └── ar0822.py
│       ├── optics/
│       │   ├── __init__.py
│       │   └── lens.py
│       ├── environment/
│       │   ├── __init__.py
│       │   ├── atmosphere.py
│       │   └── light_pollution.py
│       ├── physics/
│       │   ├── __init__.py
│       │   └── photon_calculations.py
│       └── cli.py
├── tests/
│   ├── test_astronomy/
│   ├── test_sensors/
│   ├── test_optics/
│   ├── test_environment/
│   └── test_physics/
├── configs/
│   ├── sensors/
│   │   ├── ar0235.yaml
│   │   └── ar0822.yaml
│   └── lenses/
│       ├── 8mm_f1.8.yaml
│       └── 12mm_f1.6.yaml
├── docs/
├── .github/
│   └── workflows/
│       └── ci.yml
├── .pre-commit-config.yaml
├── pyproject.toml
├── requirements.txt
├── requirements-dev.txt
├── README.md
└── LICENSE

Let's walk through the key directories:

src/photon_flux/

This is where the heart of our application lives. We've broken it down into several submodules:

  • astronomy/: This submodule handles calculations for celestial bodies like the sun, moon, and stars: their positions (altitude and azimuth) and their brightness. The sun.py module might include functions for calculating solar irradiance, moon.py could handle lunar phases and brightness, and stars.py might fetch star data from catalogs or estimate visibility under given atmospheric conditions. We want these calculations to be as accurate as possible, so integrating with an established astronomical library such as astropy (already in our dependencies) is a good idea.

  • sensors/: Here, we define how different sensors are modeled. Each sensor (like ar0235.py and ar0822.py) will have its own module, inheriting from a base class (base.py). The base class will define common interfaces, such as methods for calculating quantum efficiency, noise characteristics, and sensitivity. Each specific sensor module then implements these methods based on the sensor's datasheet. For example, the ar0235.py module will contain the specific parameters for the AR0235 sensor, like its pixel size, quantum efficiency curve, and dark current. This modular approach allows us to easily add new sensors without modifying existing code, making the project scalable and maintainable. Proper error handling and validation of sensor parameters are crucial here to ensure the reliability of the calculations.

  • optics/: This is where we handle lens-related calculations. The lens.py module will contain classes and functions for modeling lenses, including their focal length, aperture, and transmission characteristics. We might also include functions to calculate the image scale and field of view. This module is crucial for understanding how light is focused onto the sensor. We might want to consider adding support for different lens types, such as achromatic lenses or telephoto lenses, in the future. The use of configuration files (as seen in the configs/lenses/ directory) allows us to easily define and switch between different lens setups without modifying the code.

  • environment/: This submodule deals with environmental factors like atmospheric conditions and light pollution. The atmosphere.py module might contain functions to calculate atmospheric transmission and scattering, while light_pollution.py could model the effects of artificial light on observations. We need to account for factors like airmass, humidity, and aerosol content to get accurate photon flux estimates. We could integrate with external APIs or databases to fetch real-time weather data or light pollution maps. This submodule is essential for simulating realistic observation conditions and understanding how they affect the signal.

  • physics/: This is the powerhouse where the core photon calculations happen. The photon_calculations.py module will contain the functions for calculating photon flux based on the inputs from the other modules. This includes accounting for the spectral radiance of the source, the transmission of the optics and atmosphere, and the sensitivity of the sensor. We might also include functions for calculating signal-to-noise ratio and exposure time. Accuracy is paramount here, so thorough testing and validation against known physical models are crucial. Consider adding unit tests for various scenarios, such as different source spectra and atmospheric conditions, to ensure the calculations are correct.

  • cli.py: This is our command-line interface, making the calculator accessible from the terminal. We'll use a library like argparse or click to define the command-line arguments and options. This module serves as the entry point for users who want to run the calculator without diving into the code. Think about making the CLI user-friendly by providing clear help messages and sensible defaults.
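
To make the base-class idea from sensors/ concrete, here's a minimal sketch of what base.py and one concrete sensor module might look like. The class names, attributes, and all numeric values below are illustrative placeholders, not the project's actual API or real AR0235 datasheet figures:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class SensorSpec:
    """Datasheet parameters shared by all sensor models (illustrative fields)."""
    pixel_size_um: float      # pixel pitch in micrometres
    dark_current_e_s: float   # dark current in electrons/second
    read_noise_e: float       # read noise in electrons RMS


class BaseSensor(ABC):
    """Common interface that every concrete sensor module implements."""

    def __init__(self, spec: SensorSpec):
        self.spec = spec

    @abstractmethod
    def quantum_efficiency(self, wavelength_nm: float) -> float:
        """Fraction of incident photons converted to electrons."""

    def pixel_area_m2(self) -> float:
        """Light-collecting area of one (square) pixel in square metres."""
        side_m = self.spec.pixel_size_um * 1e-6
        return side_m * side_m


class AR0235(BaseSensor):
    """Concrete sensor model; numbers are placeholders, not datasheet values."""

    def quantum_efficiency(self, wavelength_nm: float) -> float:
        # Crude triangular QE curve peaking near 530 nm -- placeholder only.
        peak_nm, width_nm = 530.0, 300.0
        return max(0.0, 0.65 * (1 - abs(wavelength_nm - peak_nm) / width_nm))
```

Adding a new sensor then means adding one module with its own SensorSpec values and QE curve, without touching any existing code.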

tests/

Each submodule in src/photon_flux/ gets its own test directory here. For example, tests/test_astronomy/ will contain tests for the astronomy/ module. This keeps our tests organized and easy to find. We’ll be using pytest for our testing framework.

configs/

This directory stores configuration files, like the YAML files for sensor and lens parameters. This allows us to define different hardware setups without hardcoding values in our code. It's a best practice to separate configuration from code, making the application more flexible and easier to maintain. For example, the ar0235.yaml file will contain the specifications for the AR0235 sensor, such as its pixel size, quantum efficiency, and dark current. Similarly, 8mm_f1.8.yaml might describe an 8mm lens with an f/1.8 aperture.
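
For illustration, configs/sensors/ar0235.yaml might look something like this. The field names are one possible schema and the numbers are placeholders, not actual AR0235 datasheet values:

# configs/sensors/ar0235.yaml -- illustrative schema, placeholder values
name: AR0235
pixel_size_um: 3.0        # pixel pitch
resolution: [1928, 1208]  # active pixels, width x height
dark_current_e_s: 5.0     # electrons per second
read_noise_e: 2.0         # electrons RMS
quantum_efficiency:       # wavelength_nm: QE fraction
  450: 0.55
  530: 0.65
  650: 0.50

Swapping hardware then becomes a matter of pointing the code at a different YAML file.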

docs/

Documentation is crucial! This directory will house our project documentation, likely built with Sphinx. We’ll document our modules, classes, and functions, as well as provide usage examples and tutorials. Good documentation is essential for making the project accessible to others (and for reminding ourselves how things work months down the line!). Consider including a detailed explanation of the photon flux calculation process, as well as the assumptions and limitations of the model.

.github/workflows/

This is where our GitHub Actions workflows live. We’ll have a ci.yml file to define our continuous integration pipeline, which will automatically run tests whenever we push code. Automating tests and checks is a cornerstone of modern software development, ensuring that the codebase remains healthy and reliable. Our CI pipeline might include steps for running tests, linting, and checking code style. We might also add steps to build and deploy the documentation automatically.

Other Files

  • pyproject.toml: This file is the modern way to manage Python project metadata, dependencies, and build settings. It's the successor to setup.py and provides a more standardized and readable format. We'll use it to specify our project dependencies, testing framework, and code formatting tools.
  • README.md: This is the entry point for anyone visiting our project on GitHub. It should provide a clear overview of the project, instructions for installation and usage, and any other relevant information. A well-written README is crucial for attracting users and contributors.
  • LICENSE: This file specifies the license under which the project is released. It's important to choose a license that matches your goals for the project, whether it's open-source or proprietary.

Setting Up pyproject.toml

pyproject.toml is where we declare our project metadata and dependencies. It’s the modern way to handle Python packaging. Here’s a basic example:

[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "photon-flux-calculator"
version = "0.1.0"
description = "A Python package for calculating photon flux."
authors = [{ name = "Your Name", email = "[email protected]" }]
dependencies = [
    "numpy",
    "astropy",
    "PyYAML"
]

[project.optional-dependencies]
dev = [
    "pytest",
    "pytest-cov",
    "pre-commit",
    "pylint"
]

[project.scripts]
photon-flux = "photon_flux.cli:main"

[tool.pytest.ini_options]
addopts = ["-v", "--cov=photon_flux", "--cov-report=term-missing"]

Let's break down the key sections:

  • [build-system]: Specifies the build system requirements. We're using setuptools and wheel, which are standard tools for building Python packages.
  • [project]: Contains project metadata like name, version, description, authors, and dependencies. We list the core dependencies here, such as numpy, astropy, and PyYAML. These are the packages that our calculator needs to run.
  • [project.optional-dependencies]: Defines optional dependencies, like those for development (testing, linting, etc.). Here, we include pytest with pytest-cov for coverage reporting, plus pre-commit and pylint. These are tools that help us develop and maintain the project but aren't required for users to run the calculator.
  • [project.scripts]: Defines command-line scripts. This allows us to run our calculator from the terminal using the photon-flux command, which maps to the main function in our photon_flux.cli module.
  • [tool.pytest.ini_options]: Specifies options for pytest, such as verbosity (-v), code coverage reporting (--cov), and coverage output (--cov-report). These settings help us ensure that our tests are thorough and that we're covering as much of the codebase as possible.
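
To make the [project.scripts] entry point concrete, here's a minimal sketch of what the main function in photon_flux/cli.py might look like using argparse. The flag names and the inline calculation are illustrative; the real physics will live in the physics/ module:

```python
import argparse


def calculate(wavelength_nm: float, power_w: float) -> float:
    # Placeholder for the real physics call: flux = power / photon energy,
    # where photon energy E = h * c / wavelength.
    h, c = 6.62607015e-34, 2.99792458e8  # Planck constant (J*s), speed of light (m/s)
    return power_w * wavelength_nm * 1e-9 / (h * c)


def main(argv=None):
    """Entry point mapped to the photon-flux console script."""
    parser = argparse.ArgumentParser(
        prog="photon-flux",
        description="Calculate photon flux for a monochromatic source.",
    )
    parser.add_argument("--wavelength", type=float, default=550.0,
                        help="wavelength in nanometres (default: 550)")
    parser.add_argument("--power", type=float, default=1.0,
                        help="source power in watts (default: 1)")
    args = parser.parse_args(argv)
    flux = calculate(args.wavelength, args.power)
    print(f"{flux:.3e} photons/s")
    return 0
```

After `pip install -e .`, running `photon-flux --wavelength 550 --power 1` would invoke this main function directly from the terminal.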

Configuring pytest for Testing

Testing is super important! We’ll use pytest because it’s flexible and easy to use. We've already added pytest as a development dependency in pyproject.toml. Now, let’s create some basic tests. For example, in tests/test_physics/test_photon_calculations.py:

from photon_flux.physics import photon_calculations

def test_calculate_photon_flux():
    # Example test case
    wavelength = 550  # nm
    power = 1  # Watt
    result = photon_calculations.calculate_photon_flux(wavelength, power)
    assert result > 0

This is a very basic test, but it shows the structure. We import the function we want to test (calculate_photon_flux) and then write a test case. We assert that the result is greater than zero, which is a basic sanity check. More comprehensive tests would involve comparing the result against known values or checking edge cases.
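
For reference, a first cut of calculate_photon_flux could look like the sketch below. The physics is standard (a photon of wavelength λ carries energy E = hc/λ, so a source of power P emits P/E photons per second), but the function name and signature are simply what the test assumes, not a finalized API:

```python
# Illustrative implementation of the function the test exercises.
PLANCK_H = 6.62607015e-34       # Planck constant, J*s
SPEED_OF_LIGHT = 2.99792458e8   # speed of light, m/s


def calculate_photon_flux(wavelength_nm: float, power_w: float) -> float:
    """Photons per second emitted by a monochromatic source.

    wavelength_nm: wavelength in nanometres
    power_w: radiant power in watts
    """
    if wavelength_nm <= 0 or power_w < 0:
        raise ValueError("wavelength must be positive and power non-negative")
    wavelength_m = wavelength_nm * 1e-9
    photon_energy_j = PLANCK_H * SPEED_OF_LIGHT / wavelength_m  # E = h*c/lambda
    return power_w / photon_energy_j
```

For 550 nm at 1 W this gives roughly 2.77 × 10^18 photons per second, which is exactly the kind of known value a more thorough test could assert against.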

To run tests, just navigate to the project root in your terminal and run pytest. pytest will automatically discover and run all tests in the tests/ directory.

Setting Up GitHub Actions for CI/CD

Continuous Integration (CI) is a practice where we automatically run tests whenever code is pushed to a repository. This helps us catch errors early and often. GitHub Actions makes this easy. Create a file at .github/workflows/ci.yml:

name: CI

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python 3.10
        uses: actions/setup-python@v3
        with:
          python-version: "3.10"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -e .[dev]
      - name: Lint with pylint
        run: |
          pylint src/photon_flux
      - name: Test with pytest
        run: |
          pytest

Here’s what this workflow does:

  • name: CI: Gives our workflow a name.
  • on: ...: Defines when the workflow should run. In this case, on pushes to the main branch and on pull requests targeting the main branch.
  • jobs: ...: Defines the jobs that will run. We have one job, build.
  • runs-on: ubuntu-latest: Specifies the operating system to run the job on.
  • steps: ...: Defines the steps that will be executed in the job.
    • uses: actions/checkout@v3: Checks out our code.
    • uses: actions/setup-python@v3: Sets up Python 3.10.
    • Install dependencies: Installs our project dependencies, including the development dependencies (specified by [dev] in pyproject.toml).
    • Lint with pylint: Runs pylint to check our code for style and potential errors.
    • Test with pytest: Runs our tests using pytest.

Now, every time we push code or create a pull request, GitHub Actions will automatically run our tests and linting checks. This gives us immediate feedback on the quality of our code.

Creating requirements.txt and requirements-dev.txt

While pyproject.toml is the modern way to manage dependencies, some tools still rely on requirements.txt. We can generate these files from our pyproject.toml:

First, install pip-tools:

pip install pip-tools

Then, generate requirements.txt:

pip-compile pyproject.toml

And requirements-dev.txt:

pip-compile --extra dev -o requirements-dev.txt pyproject.toml

These files list our project dependencies in a format that older tools can understand. However, it's generally recommended to rely on pyproject.toml for dependency management in modern Python projects.

Adding Pre-commit Hooks for Code Quality

Pre-commit hooks are scripts that run automatically before you commit code. They can help us catch style issues, run tests, and enforce other code quality checks. We've already added pre-commit as a development dependency in pyproject.toml. Now, let’s configure it.

Create a file at .pre-commit-config.yaml:

repos:
-   repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
    -   id: trailing-whitespace
    -   id: end-of-file-fixer
    -   id: check-yaml
    -   id: check-added-large-files
-   repo: https://github.com/psf/black
    rev: 23.3.0
    hooks:
    -   id: black
-   repo: https://github.com/PyCQA/pylint
    rev: v2.17.4
    hooks:
    -   id: pylint
        args: [
          "--disable=missing-module-docstring,missing-class-docstring,missing-function-docstring",
        ]

This configuration uses several hooks:

  • trailing-whitespace: Removes trailing whitespace.
  • end-of-file-fixer: Ensures files end with a newline.
  • check-yaml: Checks YAML files for syntax errors.
  • check-added-large-files: Prevents large files from being committed.
  • black: Formats our code using the Black code formatter.
  • pylint: Runs Pylint to check our code for style and potential errors. We disable some docstring checks because we might not want to enforce docstrings for every module, class, and function.

To install the hooks, run:

pre-commit install

Now, every time you try to commit code, these hooks will run automatically. If any hook fails, the commit will be aborted, and you’ll need to fix the issues before committing again. This helps us maintain a consistent code style and catch potential errors early.

Creating a Comprehensive README.md

A good README.md is essential for making our project accessible to others. It should include:

  • A clear description of the project.
  • Instructions for installation and usage.
  • Examples of how to use the calculator.
  • Information about the project structure and modules.
  • Contribution guidelines.
  • License information.

Here’s a basic example:

# Photon Flux Calculator

A Python package for calculating photon flux for astronomical observations.

## Installation

```bash
pip install -e .
```

## Usage

```bash
photon-flux --help
```

## Project Structure

See the Project Structure section for details.

## Contributing

See CONTRIBUTING.md for guidelines.

## License

MIT


This is just a starting point. We should add more details as the project evolves, such as examples of how to use the calculator with different sensors and lenses, and explanations of the underlying physics and calculations.

Acceptance Criteria

Let’s make sure we’ve met our goals:

  • [x] Project can be installed with pip install -e .
  • [x] All tests pass with pytest
  • [x] Pre-commit hooks run on commit
  • [x] CI pipeline runs tests on push

If all these criteria are met, we’ve successfully set up our project structure and infrastructure! We now have a solid foundation to build our photon flux calculator on.

Conclusion

Setting up a project structure and infrastructure might seem like a lot of work upfront, but it pays off big time in the long run. A well-organized project is easier to maintain, test, and extend. By using modern tools like pyproject.toml, pytest, GitHub Actions, and pre-commit hooks, we’ve created a robust and efficient development workflow. Now, let's get to the fun part – building the calculator itself!