Passing variables between modules
Date: 2025-04-23
Query 0:
Tokens seem like a secret, which is why I have the token getter here.
But it would also make sense to package the token retriever with other API calls.
This SecretsYaml code really just loads the config.
What I really need to know is how to effectively pass the config throughout my codebase, between files.
My ideas:
- A main script that orchestrates everything.
- A PassIn class with class methods like pass_in_config_object(cls, config_object): cls.config_object = config_object, so that instances are known to each other as attributes. Various classes could gain these PassIn methods through inheritance, like ApiCall(PassIn).
What is the modern, Pythonic way to pass variables between modules? Is "state" the right term? Also, would I say passing between modules, between scripts, or between files?
ChatGPT 0:
You're asking a great and very real-world question here, George — one that a lot of Python developers end up wrestling with: "How do I pass shared configuration or context (like secrets) across different parts of a multi-file project in a clean, Pythonic way?"
Let's break it down in pieces — including your ideas — and then summarize the most idiomatic modern approach.
🔁 What You're Trying to Do
You're essentially solving for:
“I want multiple parts of my codebase (in multiple files/modules) to have access to a shared, structured config object (like secrets), without needing to pass it manually to every function.”
Yes, this is about managing shared state across modules. And state is the right word, more specifically: application state or runtime state.
✅ What You're Doing Right
- Centralizing config parsing (SecretsYaml)
- Thinking about reuse, especially for tokens
- Wanting modularity across files
- Using objects instead of globals ✅
🤔 What to Call It?
- “Between files” is fine casually
- “Between modules” is the most accurate term in Python
- “Between scripts” usually refers to command-line entry points or processes, so it might imply subprocesses or separate runs
✅ Use “between modules” when you mean shared code across .py files.
💡 Modern Pythonic Patterns
Let’s go through the idioms, including yours:
1. Main-Driven Orchestration (✔ Your Idea)
Use a main script that loads config and passes it into other classes.
This is highly Pythonic and mirrors how frameworks like FastAPI or Typer work.
# main.py
from config import load_config
from api.eds import EdsClient
from api.rjn import RjnClient

def main():
    config = load_config()
    eds = EdsClient(config)
    rjn = RjnClient(config)
This is the cleanest and most testable approach. ✅
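To make the testability point concrete: because the clients receive config explicitly, a test can hand them a throwaway dict with no YAML file or network involved. A minimal sketch, assuming (hypothetically) that EdsClient simply stores whatever config it receives on a .config attribute:

# test_clients.py: sketch only; assumes EdsClient keeps the injected
# config on a .config attribute (adjust to your actual class).
from api.eds import EdsClient

def test_eds_client_uses_injected_config():
    fake_config = {"eds_apis": {"url": "https://example.test", "token": "dummy"}}
    eds = EdsClient(fake_config)
    # No secrets file, no network: the client only sees what the test passed in.
    assert eds.config == fake_config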
2. Singleton Config or Registry Pattern
A lightweight global-ish pattern:
# settings.py
_config = None

def set_config(cfg):
    global _config
    _config = cfg

def get_config():
    return _config

# elsewhere
from settings import get_config
cfg = get_config()
⚠ Use this only if you truly need global-style access; it is not great for testability or transparency.
3. Your PassIn Pattern via Inheritance
class PassIn:
    _config = None

    @classmethod
    def set_config(cls, cfg):
        cls._config = cfg

    @classmethod
    def get_config(cls):
        return cls._config
This is creative and works; you're essentially building a poor man's DI container. If you prefer it over a module-level get_config(), go for it.
💡 One tweak: rather than requiring a common base class, you might provide a ConfigurableMixin so other classes can opt in:
class ConfigurableMixin:
    config = None

    @classmethod
    def set_config(cls, cfg):
        cls.config = cfg
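For example, a client class could opt in by inheriting the mixin, with the config set once at startup. A small sketch (EdsClient and the config keys here are hypothetical):

# Hypothetical usage: EdsClient opts into the shared config via the mixin.
class EdsClient(ConfigurableMixin):
    def get_token(self):
        # Reads the class-level config set once at startup.
        return self.config["eds_apis"]["token"]

# At startup (e.g., in main()):
ConfigurableMixin.set_config({"eds_apis": {"token": "abc123"}})
print(EdsClient().get_token())  # -> abc123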
4. Dependency Injection (DI) Container
Advanced: use libraries like dependency-injector.
from dependency_injector import containers, providers

class AppContainer(containers.DeclarativeContainer):
    config = providers.Configuration()
    eds_client = providers.Factory(EdsClient, config=config)
Great for large apps but may be overkill unless you're building a framework or microservices.
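Wiring it up at startup might look roughly like this (a sketch; from_dict is one of several ways dependency-injector can populate a Configuration provider, and the config values here are placeholders):

# Sketch of wiring the container at startup.
container = AppContainer()
container.config.from_dict({"eds_apis": {"url": "https://example.test"}})

# Each call to the Factory provider builds a client with the injected config.
eds = container.eds_client()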
🧠 TL;DR – What I Recommend for You
Given what you’re doing now and how you’re thinking:
# main.py (or __main__.py)
from src.config import SecretsYaml
from src.api.eds import EdsClient
from src.api.rjn import RjnClient

def main():
    config_obj = SecretsYaml.load_config()
    eds = EdsClient(config_obj['eds_apis'])
    rjn = RjnClient(config_obj['rjn_api'])
    token_eds = eds.get_token()
    token_rjn = rjn.get_token()
- Keep SecretsYaml for config loading
- Write clients that only take the data they need (see the sketch after this list)
- Avoid hidden state unless you're writing a library or plugin system
- Avoid globals unless absolutely necessary (and encapsulate them)
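As a sketch of what "clients that only take the data they need" could look like (the endpoint and field names such as "url" and "api_key" are hypothetical; adjust them to your actual eds_apis section):

# api/eds.py: hypothetical sketch of a client that receives only its own
# config section rather than the whole secrets file.
import requests

class EdsClient:
    def __init__(self, eds_config: dict):
        self.base_url = eds_config["url"]
        self.api_key = eds_config["api_key"]

    def get_token(self):
        # Encapsulates the auth call; callers never touch the raw secrets.
        resp = requests.post(f"{self.base_url}/auth", json={"key": self.api_key})
        resp.raise_for_status()
        return resp.json()["token"]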
If you want, I can sketch out how to restructure SecretsYaml to just load config, and how to write EdsClient and RjnClient so that they encapsulate their own auth logic. Want that example?
Query 1:
My only concern with the main script is that it does not allow for user interaction in enterprise-grade software. I want ongoing interaction, as opposed to calling a fixed orchestration.
ChatGPT 1:
Ahh — got it. You're not just writing a single-shot CLI script or automation tool; you're building something interactive, possibly long-running, possibly event- or command-driven. That changes things a lot.
Now we’re in the territory of enterprise app architecture, where the app doesn’t just “run and exit,” but runs, responds, and evolves during execution — and the config/state needs to remain accessible across modules and time.
Let’s break it down.
🎯 Your Core Need
I need configuration and shared state to persist and be accessible across modules and runtime, for ongoing interaction — like commands, menu choices, or GUI actions.
✅ Good News
You're thinking correctly — a main orchestrator is great for startup, but doesn't limit you. Think of it as your entry point, not a one-time runner.
🔁 Your Modern Options for Interactive Apps
Depending on your interface and goals:
1. Central “App State” Object (Highly Recommended)
You create one top-level object, such as an AppState, Context, or Session class, and pass it around (or make it accessible) to other components.
# app/state.py
class AppState:
    def __init__(self, config):
        self.config = config
        self.token_cache = {}
        # possibly attach loggers, ui, user, etc.

    def get_eds_client(self):
        return EdsClient(self.config['eds_apis'])

    def get_rjn_client(self):
        return RjnClient(self.config['rjn_api'])
Then your main loop or controller can drive the interaction.
# main.py
def main():
    config = SecretsYaml.load_config()
    app = AppState(config)

    while True:
        cmd = input(">>> ")
        if cmd == "get_token":
            token, _ = app.get_eds_client().get_token()
            print(f"Token: {token}")
        elif cmd == "exit":
            break
✅ This lets you keep an interactive loop and still cleanly organize state.
2. Use a Class to Model Your CLI / GUI / Service
You could encapsulate interaction in a reusable app class (here called MyApp) that holds config and logic:
class MyApp:
    def __init__(self, config):
        self.state = AppState(config)

    def run(self):
        while True:
            choice = input("Action? ")
            self.route(choice)

    def route(self, choice):
        if choice == "eds":
            token, _ = self.state.get_eds_client().get_token()
            print(token)
This is very close to how GUI apps, TUI apps, and CLI frameworks (like Typer or Click) are structured.
3. Leverage a Framework If Needed
- For CLI/TUI: Typer or prompt_toolkit for interactivity
- For GUI: Qt (via PySide6) or tkinter
- For API servers: FastAPI
All of these let you inject state/config into the core app object or via DI containers.
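For example, with Typer the shared state can live on the context object that every command receives. A minimal sketch, assuming the SecretsYaml loader and the AppState class from above:

# cli.py: minimal Typer sketch; SecretsYaml and AppState are assumed
# to be importable from your project.
import typer

app = typer.Typer()

@app.callback()
def startup(ctx: typer.Context):
    # Runs before any command: build the shared state once.
    config = SecretsYaml.load_config()
    ctx.obj = AppState(config)

@app.command()
def get_token(ctx: typer.Context):
    token, _ = ctx.obj.get_eds_client().get_token()
    typer.echo(f"Token: {token}")

if __name__ == "__main__":
    app()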
🧠 State Management Pattern
In your case, think of config as part of a wider "runtime context" — so instead of just passing config, model your context as a class:
class RuntimeContext:
    def __init__(self, config):
        self.config = config
        self.session = {}
        self.user = None
        self.logger = ...
        self.clients = {}  # maybe cache API clients
Then pass ctx to functions or modules as needed, or provide a static accessor for truly shared state.
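For instance, any module can accept the context and lazily cache the client it needs (a sketch, reusing the hypothetical EdsClient from earlier):

# Sketch: a function elsewhere in the codebase takes the context and
# caches the client it needs on ctx.clients.
def fetch_eds_token(ctx: RuntimeContext):
    if "eds" not in ctx.clients:
        ctx.clients["eds"] = EdsClient(ctx.config["eds_apis"])
    return ctx.clients["eds"].get_token()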
✅ Final Thoughts
- You can have interactivity and structure.
- Use a main entry point to create and configure your runtime state, but not to handle everything.
- Drive ongoing interaction through a loop or controller class, using a RuntimeContext, AppState, or whatever feels natural.
Would you like a refactor example of your current script using this AppState pattern?