Friday, May 30, 2025

Rakshas' law on nuclear automata and quantum relativity - 2 Versions! Updated for General Relativity/Quantum Mechanical exploration!

Radical! Nearest-neighbour tracing in nuclear automata. Below the first program is a second one that incorporates general relativity, adding a gravitational time-dilation ratio/scalar-curvature term as a k-body factor for certain chemical stochastic reporting systems; you may choose to use either. This new update also explores a hypothetical relationship between the gravitational curvature parameter and a quantum mechanical property, which might elucidate how such a "bridge" could be conceptually built, even if not physically validated :-)
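To make the "nearest neighbour boundary" idea concrete before the full program: the minimal sketch below (my own illustration, not part of the program, with a made-up 8-day half-life) flags the points where a decay curve changes fastest, using np.gradient and a simple threshold - the same trick the program applies to gamma curies over a bell curve of measurement times.

import numpy as np

# Illustrative decay curve for a hypothetical 8-day half-life source.
half_life_days = 8.0
decay_constant = np.log(2) / half_life_days            # per day
t = np.linspace(0, 40, 200)                            # days
activity = 1.0e6 * np.exp(-decay_constant * t)         # arbitrary initial activity

rate_of_change = np.gradient(activity, t)              # numerical dA/dt
threshold = 0.1 * np.max(np.abs(rate_of_change))       # keep roughly the top 10% steepest changes
boundary_mask = np.abs(rate_of_change) > threshold

print(f"{boundary_mask.sum()} of {len(t)} points lie in the 'boundary' region,")
print(f"all before t = {t[boundary_mask].max():.1f} days (the early, steep part of the decay).")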

 

For the second program, search this page for "I postulate" to jump to it.

 

import numpy as np
import matplotlib.pyplot as plt
import pubchempy as pcp
import requests
import json
import time
import re

# --- Scientific Constants ---
# Nuclear Physics Constants
CURIE_TO_BQ = 3.7e10           # Conversion factor: 1 Curie = 3.7 x 10^10 Becquerels (disintegrations per second)
LN2 = np.log(2)                # Natural logarithm of 2, used in half-life calculations

# Blackbody Radiation Constants
STEFAN_BOLTZMANN_CONSTANT = 5.670374419e-8 # Stefan-Boltzmann constant in W/(m^2·K^4)
PLANCK_CONSTANT = 6.62607015e-34 # Planck's constant in J·s
BOLTZMANN_CONSTANT = 1.380649e-23 # Boltzmann constant in J/K
SPEED_OF_LIGHT = 2.99792458e8 # Speed of light in a vacuum in m/s
WIEN_DISPLACEMENT_CONSTANT = 2.897771955e-3 # Wien's displacement law constant in m·K

# --- Manual Nuclear Data for Gamma Emission ---
# This dictionary stores known half-lives and gamma emission properties for isotopes.
# IMPORTANT: These values should be verified from reliable nuclear data sources (e.g., NNDC).
# PubChem provides general information, but specific gamma yields are best from dedicated databases.
MANUAL_GAMMA_DATA = {
    "Cesium-137": {
        "half_life_days": 30.08 * 365.25, # Half-life in days (approx 30.08 years)
        "gamma_energies_MeV": [0.662],
        "gamma_yields_per_decay": [0.851] # Gamma yield per decay (from Ba-137m isomer)
    },
    "Cobalt-60": {
        "half_life_days": 1925.21, # Half-life in days
        "gamma_energies_MeV": [1.173, 1.332],
        "gamma_yields_per_decay": [0.999, 0.999] # Approximately 1 gamma per decay at each energy
    },
    "Iodine-131": {
        "half_life_days": 8.02, # Half-life in days
        "gamma_energies_MeV": [0.364],
        "gamma_yields_per_decay": [0.817] # Gamma yield per decay
    },
    "Potassium-40": {
        "half_life_days": 1.251e9 * 365.25, # Half-life in days (~1.251 billion years)
        "gamma_energies_MeV": [1.4608],
        "gamma_yields_per_decay": [0.1067] # ~10.67% of K-40 decays via electron capture
    },
    # Feel free to add more isotopes here with their respective data
}

# --- PubChem Helper Functions (for half-life lookup if not in manual data) ---
# These functions attempt to fetch half-life data from PubChem,
# but their parsing can be fragile due to variations in PubChem's data structure.
def _get_compound_cid_pubchem(compound_name):
    """
    Searches PubChem for a compound's CID (Compound ID) by name.
    Returns the CID if found, otherwise None.
    """
    try:
        compounds = pcp.get_compounds(compound_name, 'name')
        return compounds[0].cid if compounds else None
    except Exception:
        return None

def _fetch_half_life_from_pubchem(cid):
    """
    Fetches half-life data from PubChem using a compound's CID.
    Attempts to parse numerical value and unit from the returned text.
    Returns (value, unit) tuple if successful, otherwise (None, None).
    """
    base_url = "https://pubchem.ncbi.nlm.nih.gov/rest/pug_view/data/compound/"
    url = f"{base_url}cid/{cid}/JSON/"
    try:
        response = requests.get(url, timeout=5)
        response.raise_for_status() # Raise an exception for HTTP errors
        data = response.json()

        search_sections = ['Atomic Mass, Half Life, and Decay', 'Chemical and Physical Properties', 'Depositor-Supplied']
        for record in data.get('Record', {}).get('Section', []):
            if record.get('TOCHeading') in search_sections:
                for prop_section in record.get('Section', []):
                    if 'half-life' in prop_section.get('TOCHeading', '').lower():
                        for item in prop_section.get('Information', []):
                            for value_dict in item.get('Value', {}).get('StringWithMarkup', []):
                                text = value_dict.get('String')
                                if text:
                                    # Use a regex search (not match) so the value and unit can appear anywhere in the text
                                    match = re.search(r"(\d+\.?\d*(?:e[+-]?\d+)?)\s*(seconds|minutes|hours|days|years|s|min|hr|d|y)", text.lower())
                                    if match:
                                        return float(match.group(1)), match.group(2)
        return None, None
    except Exception:
        return None, None

def _convert_half_life_to_days(value, unit):
    """
    Converts a half-life value from various units to days.
    Returns the half-life in days, or None if the unit is unrecognized.
    """
    if unit in ['s', 'seconds']: return value / (24 * 3600)
    if unit in ['min', 'minutes']: return value / (24 * 60)
    if unit in ['hr', 'hours']: return value / 24
    if unit in ['d', 'days']: return value
    if unit in ['y', 'years']: return value * 365.25
    return None

# --- Core Radiation Calculation Functions ---
def calculate_decay_constant(half_life_days):
    """
    Calculates the radioactive decay constant (lambda) from the half-life.
    Returns the decay constant in inverse seconds (s^-1).
    If half_life_days is non-positive, returns 0 (implying a stable isotope).
    """
    if half_life_days is None or half_life_days <= 0:
        return 0
    half_life_seconds = half_life_days * 24 * 3600 # Convert days to seconds
    return LN2 / half_life_seconds

def calculate_activity_at_time(initial_activity_Bq, decay_constant, time_elapsed_seconds):
    """
    Calculates the activity (number of decays per second) of a radioactive source
    at a given time, using the decay law (N = N0 * e^(-lambda*t)).
    Returns activity in Becquerels (Bq).
    """
    if decay_constant == 0:
        return initial_activity_Bq # For stable isotopes or infinite half-life
    return initial_activity_Bq * np.exp(-decay_constant * time_elapsed_seconds)

def get_gamma_curies_single_point(isotope_name, initial_activity_Bq, time_elapsed_days):
    """
    Calculates the total gamma curies emitted by an isotope at a specific time.
    It first tries to get half-life from the `MANUAL_GAMMA_DATA`, then falls back to PubChem.
    Returns the gamma activity in Curies (Ci), or None if half-life data isn't found.
    """
    half_life_days = None
    if isotope_name in MANUAL_GAMMA_DATA:
        half_life_days = MANUAL_GAMMA_DATA[isotope_name]["half_life_days"]
    else:
        # If not in manual data, try PubChem
        cid = _get_compound_cid_pubchem(isotope_name)
        if cid:
            hl_val, hl_unit = _fetch_half_life_from_pubchem(cid)
            if hl_val:
                half_life_days = _convert_half_life_to_days(hl_val, hl_unit)

    if half_life_days is None:
        return None # Could not determine half-life

    gamma_yields = MANUAL_GAMMA_DATA.get(isotope_name, {}).get("gamma_yields_per_decay", [0])
    total_gamma_yield_per_decay = sum(gamma_yields)

    decay_constant = calculate_decay_constant(half_life_days)
    time_elapsed_seconds = time_elapsed_days * 24 * 3600

    current_activity_Bq = calculate_activity_at_time(initial_activity_Bq, decay_constant, time_elapsed_seconds)
    total_gamma_disintegrations_per_second = current_activity_Bq * total_gamma_yield_per_decay
    gamma_curies = total_gamma_disintegrations_per_second / CURIE_TO_BQ
    
    return gamma_curies

def calculate_blackbody_spectral_radiance(wavelength_m, temperature_K):
    """
    Calculates the spectral radiance of a blackbody at a given wavelength and temperature,
    using Planck's Law.
    Returns spectral radiance in Watts per steradian per square meter per meter of wavelength (W/(m³·sr)).
    Returns 0.0 if temperature or wavelength are non-positive or if the exponent is too large.
    """
    if temperature_K <= 0 or wavelength_m <= 0:
        return 0.0

    term1 = (2 * PLANCK_CONSTANT * SPEED_OF_LIGHT**2) / (wavelength_m**5)
    term2_exp_arg = (PLANCK_CONSTANT * SPEED_OF_LIGHT) / (wavelength_m * BOLTZMANN_CONSTANT * temperature_K)
    
    # Check to prevent numerical overflow for very large exponent arguments
    if term2_exp_arg > 700:
        return 0.0 # Effectively zero emission at this wavelength/temperature
        
    spectral_radiance = term1 / (np.exp(term2_exp_arg) - 1)
    return spectral_radiance

# --- Utility Function for Generating Bell Curve Data ---
def generate_normal_distribution_series(mean, std_dev, num_points=200, spread_factor=3):
    """
    Generates a series of values that follow a normal (bell curve) distribution.
    This is used to create a distribution for a parameter, like measurement time.
    
    Args:
        mean (float): The mean (average) of the distribution.
        std_dev (float): The standard deviation of the distribution.
        num_points (int): The number of data points to generate.
        spread_factor (int): How many standard deviations away from the mean
                              the series should span (e.g., 3 for 3-sigma range).
    
    Returns:
        tuple: A tuple containing:
            - series_values (numpy.array): The generated data points.
            - pdf_values (numpy.array): The probability density function (PDF) values
                                        corresponding to `series_values`.
    """
    start = mean - spread_factor * std_dev
    end = mean + spread_factor * std_dev
    if start < 0:
        start = 0 # Ensure time or other physical quantities are non-negative
    
    series_values = np.linspace(start, end, num_points)
    pdf_values = (1 / (std_dev * np.sqrt(2 * np.pi))) * \
                 np.exp(-0.5 * ((series_values - mean) / std_dev)**2)
    return series_values, pdf_values

# --- Analysis Functions for "Nearest Neighbor Boundaries" and "Displacement" ---

def analyze_gamma_curies_nearest_neighbor_boundaries(isotope_name, initial_activity_Bq, 
                                                     mean_time_days, std_dev_time_days, 
                                                     num_points=200, derivative_threshold_abs=None):
    """
    Calculates gamma curies over a series of measurement times sampled from a bell curve.
    It identifies "nearest neighbor boundaries" by highlighting regions where the
    absolute rate of change (derivative) of gamma curies is above a given threshold.
    This represents where the decay process is changing most rapidly.
    
    Args:
        isotope_name (str): The name of the radioactive isotope (e.g., "Cesium-137").
        initial_activity_Bq (float): The initial activity of the isotope in Becquerels (Bq).
        mean_time_days (float): The mean (average) of the bell curve distribution for measurement times in days.
        std_dev_time_days (float): The standard deviation of the bell curve for measurement times in days.
        num_points (int): The number of time points to simulate.
        derivative_threshold_abs (float, optional): The absolute threshold for the derivative (Ci/day).
                                                    If None, the program will auto-calculate a threshold
                                                    (e.g., 10% of the maximum absolute derivative).
    """
    print(f"\n--- Identifying 'Nearest Neighbor Boundaries' in Gamma Curies ---")
    print(f"  Isotope: {isotope_name}, Initial Activity: {initial_activity_Bq:.2e} Bq")
    print(f"  Time Distribution: Mean={mean_time_days:.2f} days, Std Dev={std_dev_time_days:.2f} days")
    print(f"  'Nearest Neighbor Boundaries' are regions where the decay rate changes significantly.")

    # 1. Generate time series based on a normal distribution
    time_series_days, pdf_values = generate_normal_distribution_series(
        mean_time_days, std_dev_time_days, num_points=num_points)
    
    # 2. Calculate gamma curies for each time point
    curies_series = []
    for t_days in time_series_days:
        curies = get_gamma_curies_single_point(isotope_name, initial_activity_Bq, t_days)
        curies_series.append(curies if curies is not None else np.nan) # Store None as NaN
    curies_series = np.array(curies_series) # Convert to NumPy array for numerical operations

    # Handle cases where half-life was not found (curies_series will be all NaN)
    if np.all(np.isnan(curies_series)):
        print(f"  Calculation failed. Please ensure '{isotope_name}' has half-life data.")
        return

    # 3. Calculate the numerical derivative (rate of change) of gamma curies with respect to time
    # np.gradient provides a robust way to estimate the derivative for discrete data.
    derivative_curies = np.gradient(curies_series, time_series_days)

    # 4. Determine or use a threshold for identifying "boundaries"
    if derivative_threshold_abs is None:
        # If no threshold is provided, auto-calculate it as 10% of the maximum absolute derivative
        # This highlights the top 10% steepest changes.
        max_abs_deriv = np.max(np.abs(derivative_curies[np.isfinite(derivative_curies)]))
        derivative_threshold_abs = max_abs_deriv * 0.1
        print(f"  Auto-calculated derivative threshold: {derivative_threshold_abs:.2e} Ci/day")
    else:
        print(f"  Using provided derivative threshold: {derivative_threshold_abs:.2e} Ci/day")

    # 5. Identify points where the absolute derivative exceeds the threshold
    boundary_indices = np.where(np.abs(derivative_curies) > derivative_threshold_abs)
    boundary_times = time_series_days[boundary_indices]
    boundary_curies = curies_series[boundary_indices]

    # --- Plotting Results ---
    plt.figure(figsize=(12, 7))
    plt.plot(time_series_days, curies_series, label='Gamma Curies Decay')
    plt.scatter(boundary_times, boundary_curies, color='red', s=50, 
                label='Nearest Neighbor Boundaries (High Rate of Change)', zorder=5)
    
    plt.title(f'Gamma Curies with "Nearest Neighbor Boundaries" for {isotope_name}')
    plt.xlabel('Time (days)')
    plt.ylabel('Gamma Curies (Ci)')
    plt.grid(True)
    plt.legend()
    plt.tight_layout()
    plt.show()

    # Optional: Plot the derivative itself for more insight
    plt.figure(figsize=(10, 5))
    plt.plot(time_series_days, derivative_curies, label='Rate of Change (dCuries/dt)')
    plt.axhline(y=derivative_threshold_abs, color='grey', linestyle='--', label='Positive Threshold')
    plt.axhline(y=-derivative_threshold_abs, color='grey', linestyle='--', label='Negative Threshold')
    plt.title('Rate of Change of Gamma Curies over Time')
    plt.xlabel('Time (days)')
    plt.ylabel('dCuries/dt (Ci/day)')
    plt.grid(True)
    plt.legend()
    plt.tight_layout()
    plt.show()

    print(f"\n  Identified {len(boundary_times)} data points in 'Nearest Neighbor Boundary' regions.")
    if len(boundary_times) > 0:
        print("  Example Boundary Points (Time, Curies, Derivative):")
        # Print a few examples from the identified boundary points
        for i in range(min(5, len(boundary_times))):
            idx = boundary_indices[0][i] # Get the original index from the series
            print(f"    Time: {time_series_days[idx]:.2f} days, Curies: {curies_series[idx]:.2e} Ci, Derivative: {derivative_curies[idx]:.2e} Ci/day")

def simulate_blackbody_wien_displacement(temperatures_K, wavelength_range_nm=(1, 10000)):
    """
    Simulates blackbody spectral radiance across a range of wavelengths for given temperatures,
    and demonstrates Wien's Displacement Law by showing how the peak emission wavelength
    ("displaces" or shifts) with temperature.
    
    Args:
        temperatures_K (list): A list of temperatures in Kelvin (K) to simulate.
        wavelength_range_nm (tuple): A tuple (min_wavelength, max_wavelength) in nanometers.
    """
    print(f"\n--- Simulating Blackbody Spectral Radiance & Wien's Displacement Law ---")
    print(f"  Analyzing temperatures: {temperatures_K} K")
    
    wavelengths_nm = np.linspace(wavelength_range_nm[0], wavelength_range_nm[1], 500)
    wavelengths_m = wavelengths_nm * 1e-9 # Convert nanometers to meters

    peak_wavelengths_data = [] # To store (Temperature, Peak Wavelength) for Wien's Law plot

    plt.figure(figsize=(12, 7))
    for T in temperatures_K:
        if T <= 0:
            print(f"  Skipping temperature {T}K as it is non-positive.")
            continue
        radiance_series = [calculate_blackbody_spectral_radiance(wl_m, T) for wl_m in wavelengths_m]
        
        # Find the peak wavelength for the current temperature
        max_radiance_idx = np.argmax(radiance_series)
        peak_wl_m = wavelengths_m[max_radiance_idx]
        peak_wl_nm = wavelengths_nm[max_radiance_idx]
        peak_wavelengths_data.append((T, peak_wl_nm))

        plt.plot(wavelengths_nm, radiance_series, label=f'T = {T} K (Peak: {peak_wl_nm:.2f} nm)')
    
    plt.title('Blackbody Spectral Radiance (Planck\'s Law)')
    plt.xlabel('Wavelength (nm)')
    plt.ylabel('Spectral Radiance (W/(m³·sr))')
    plt.yscale('log') # Log scale helps visualize broad range of radiance values
    plt.xscale('log') # Log scale for wavelength can also be useful for broad ranges
    plt.grid(True, which="both", ls="--", alpha=0.7)
    plt.legend()
    plt.tight_layout()
    plt.show()

    print("\n--- Wien's Displacement Law Data Series ---")
    print("  This shows how the peak emission wavelength 'displaces' (shifts) to shorter wavelengths as temperature increases.")
    
    temp_list_sorted = [data[0] for data in sorted(peak_wavelengths_data)]
    peak_wl_list = [data[1] for data in sorted(peak_wavelengths_data)]
    
    for T, peak_wl_nm in sorted(peak_wavelengths_data):
        print(f"  Temperature: {T} K, Calculated Peak Wavelength: {peak_wl_nm:.2f} nm")
        
    plt.figure(figsize=(8, 6))
    plt.plot(temp_list_sorted, peak_wl_list, 'o-', label='Calculated Peak Wavelength')
    
    # Calculate and plot the theoretical curve from Wien's Displacement Law
    theoretical_peak_nm = (WIEN_DISPLACEMENT_CONSTANT / np.array(temp_list_sorted)) * 1e9
    plt.plot(temp_list_sorted, theoretical_peak_nm, 'r--', label='Theoretical (Wien\'s Law)')

    plt.title('Wien\'s Displacement Law: Peak Wavelength vs. Temperature')
    plt.xlabel('Temperature (K)')
    plt.ylabel('Peak Wavelength (nm)')
    plt.grid(True)
    plt.legend()
    plt.tight_layout()
    plt.show()

# --- Main Program Execution ---
if __name__ == "__main__":
    print("--- Radiation Analysis Program: Interpreting Data Behavior ---")
    print("\nThis program helps you understand how different scientific concepts can be analyzed using statistical ideas.")
    print("We'll explore two scenarios:")
    print("  1.  **'Nearest Neighbor Boundaries' in Gamma Curies:** This identifies points where the rate of radioactive decay is changing most rapidly, relative to measurement times that follow a bell curve.")
    print("  2.  **Blackbody Wavelength 'Displacement':** This demonstrates how the peak wavelength of light emitted by a hot object shifts as its temperature changes (Wien's Law).\n")
    print("Please note: The terms 'Nearest Neighbor Boundaries' and 'Edge States' are statistical interpretations here, not direct physical phenomena like in quantum mechanics.\n")

    while True:
        main_choice = input("Select: (N)earest Neighbor Boundaries (Gamma Curies), (B)lackbody Wavelength Displacement, or (Q)uit? ").strip().lower()

        if main_choice == 'q':
            break
        elif main_choice == 'n':
            print("\n--- 'Nearest Neighbor Boundaries' for Gamma Curies ---")
            isotope_name = input("  Enter radioactive isotope name (e.g., Cesium-137, Cobalt-60): ").strip()
            # Check if isotope data is available either manually or via PubChem
            if isotope_name not in MANUAL_GAMMA_DATA and _get_compound_cid_pubchem(isotope_name) is None:
                print(f"  '{isotope_name}' data not found. Please check spelling or add it to MANUAL_GAMMA_DATA.")
                continue
            
            try:
                initial_activity = float(input(f"  Enter initial activity of {isotope_name} in Becquerels (Bq): "))
                mean_time = float(input(f"  Enter mean measurement time for the time distribution (days): "))
                std_dev_time = float(input(f"  Enter standard deviation for the measurement time (days): "))
                num_points = int(input(f"  Enter number of data points for the simulation (e.g., 200): "))
                
                deriv_thresh_input = input(f"  Enter absolute derivative threshold for 'boundaries' (Ci/day, leave blank for auto-calc): ")
                derivative_threshold_abs = float(deriv_thresh_input) if deriv_thresh_input else None

            except ValueError:
                print("  Invalid input. Please enter numerical values.")
                continue
            
            analyze_gamma_curies_nearest_neighbor_boundaries(
                isotope_name, initial_activity, mean_time, std_dev_time, num_points, derivative_threshold_abs)
            
            time.sleep(1) # Small pause for readability

        elif main_choice == 'b':
            print("\n--- Blackbody Wavelength 'Displacement' (Wien's Law) ---")
            temp_input_str = input("  Enter temperatures (K) separated by commas (e.g., 300, 1000, 5000): ")
            try:
                temperatures_list = [float(t.strip()) for t in temp_input_str.split(',')]
            except ValueError:
                print("  Invalid temperature input. Please enter numbers separated by commas.")
                continue
            
            simulate_blackbody_wien_displacement(temperatures_list)
            
            time.sleep(1) # Small pause for readability

        else:
            print("Invalid choice. Please enter 'N', 'B', or 'Q'.")

    print("\nExiting program. Thank you for exploring these concepts!")

 

General Relativity as a k-body factor for chemical study.

I postulate there is no Universe in a black hole :-o 
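Before the program itself, here is a minimal sketch of the time-dilation factor it uses as the k-body factor, sqrt(1 - 2GM/(r c^2)). The solar mass and radius below are approximate values quoted from memory, for illustration only; a distant observer sees the local half-life stretched by 1/factor.

import numpy as np

G = 6.67430e-11            # gravitational constant, m^3 kg^-1 s^-2
C_SQ = 2.99792458e8 ** 2   # speed of light squared, m^2 s^-2

def dilation_factor(mass_kg, distance_m):
    """sqrt(1 - r_s/r); values below 1 mean local clocks run slow for a distant observer."""
    r_s = 2 * G * mass_kg / C_SQ                       # Schwarzschild radius
    return np.sqrt(1 - r_s / distance_m) if distance_m > r_s else 0.0

# Approximate solar mass and radius (illustrative figures only).
factor = dilation_factor(1.989e30, 6.957e8)
print(f"Dilation factor at the Sun's surface: {factor:.8f}")
print(f"An 8.02-day half-life would appear to a distant observer as {8.02 / factor:.6f} days.")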

import numpy as np
import matplotlib.pyplot as plt
import pubchempy as pcp
import requests
import json
import time
import re

# --- Scientific Constants ---
# Nuclear Physics Constants
CURIE_TO_BQ = 3.7e10           # Conversion factor: 1 Curie = 3.7 x 10^10 Becquerels (disintegrations per second)
LN2 = np.log(2)                # Natural logarithm of 2, used in half-life calculations

# Blackbody Radiation Constants
STEFAN_BOLTZMANN_CONSTANT = 5.670374419e-8 # Stefan-Boltzmann constant in W/(m^2·K^4)
PLANCK_CONSTANT = 6.62607015e-34 # Planck's constant in J·s
BOLTZMANN_CONSTANT = 1.380649e-23 # Boltzmann constant in J/K
SPEED_OF_LIGHT = 2.99792458e8 # Speed of light in a vacuum in m/s
WIEN_DISPLACEMENT_CONSTANT = 2.897771955e-3 # Wien's displacement law constant in m·K

# General Relativity Constants (for the new functions)
GRAVITATIONAL_CONSTANT = 6.67430e-11 # G in m³·kg⁻¹·s⁻²
SPEED_OF_LIGHT_SQ = SPEED_OF_LIGHT**2

# --- Manual Nuclear Data for Gamma Emission ---
MANUAL_GAMMA_DATA = {
    "Cesium-137": {
        "half_life_days": 30.08 * 365.25,
        "gamma_energies_MeV": [0.662],
        "gamma_yields_per_decay": [0.851]
    },
    "Cobalt-60": {
        "half_life_days": 1925.21,
        "gamma_energies_MeV": [1.173, 1.332],
        "gamma_yields_per_decay": [0.999, 0.999]
    },
    "Iodine-131": {
        "half_life_days": 8.02,
        "gamma_energies_MeV": [0.364],
        "gamma_yields_per_decay": [0.817]
    },
    "Potassium-40": {
        "half_life_days": 1.251e9 * 365.25,
        "gamma_energies_MeV": [1.4608],
        "gamma_yields_per_decay": [0.1067]
    },
}

# --- PubChem Helper Functions (Unchanged) ---
def _get_compound_cid_pubchem(compound_name):
    try:
        compounds = pcp.get_compounds(compound_name, 'name')
        return compounds[0].cid if compounds else None
    except Exception:
        return None

def _fetch_half_life_from_pubchem(cid):
    base_url = "https://pubchem.ncbi.nlm.nih.gov/rest/pug_view/data/compound/"
    url = f"{base_url}cid/{cid}/JSON/"
    try:
        response = requests.get(url, timeout=5)
        response.raise_for_status()
        data = response.json()
        search_sections = ['Atomic Mass, Half Life, and Decay', 'Chemical and Physical Properties', 'Depositor-Supplied']
        for record in data.get('Record', {}).get('Section', []):
            if record.get('TOCHeading') in search_sections:
                for prop_section in record.get('Section', []):
                    if 'half-life' in prop_section.get('TOCHeading', '').lower():
                        for item in prop_section.get('Information', []):
                            for value_dict in item.get('Value', {}).get('StringWithMarkup', []):
                                text = value_dict.get('String')
                                if text:
                                    match = re.search(r"(\d+\.?\d*(?:e[+-]?\d+)?)\s*(seconds|minutes|hours|days|years|s|min|hr|d|y)", text.lower())  # search, not match, so the value may appear mid-text
                                    if match:
                                        return float(match.group(1)), match.group(2)
        return None, None
    except Exception:
        return None, None

def _convert_half_life_to_days(value, unit):
    if unit in ['s', 'seconds']: return value / (24 * 3600)
    if unit in ['min', 'minutes']: return value / (24 * 60)
    if unit in ['hr', 'hours']: return value / 24
    if unit in ['d', 'days']: return value
    if unit in ['y', 'years']: return value * 365.25
    return None

# --- Core Radiation Calculation Functions (Modified to accept dilation factor) ---
def calculate_decay_constant(half_life_days):
    if half_life_days is None or half_life_days <= 0:
        return 0
    half_life_seconds = half_life_days * 24 * 3600
    return LN2 / half_life_seconds

def calculate_activity_at_time(initial_activity_Bq, decay_constant, time_elapsed_seconds):
    if decay_constant == 0:
        return initial_activity_Bq
    return initial_activity_Bq * np.exp(-decay_constant * time_elapsed_seconds)

def calculate_blackbody_spectral_radiance(wavelength_m, temperature_K):
    if temperature_K <= 0 or wavelength_m <= 0:
        return 0.0
    term1 = (2 * PLANCK_CONSTANT * SPEED_OF_LIGHT_SQ) / (wavelength_m**5)
    term2_exp_arg = (PLANCK_CONSTANT * SPEED_OF_LIGHT) / (wavelength_m * BOLTZMANN_CONSTANT * temperature_K)
    if term2_exp_arg > 700:
        return 0.0
    spectral_radiance = term1 / (np.exp(term2_exp_arg) - 1)
    return spectral_radiance

# --- UPDATED: General Relativity Functions ---
def calculate_schwarzschild_radius(mass_kg):
    """Calculates the Schwarzschild radius for a given mass."""
    if mass_kg <= 0:
        return 0.0
    return (2 * GRAVITATIONAL_CONSTANT * mass_kg) / SPEED_OF_LIGHT_SQ

def calculate_dilation_factor(mass_kg, distance_m):
    """
    Calculates the time dilation factor for an observer at infinity relative to an object in a gravitational field.
    A factor of 1 means no dilation. Values < 1 mean time is dilated (slowed down).
    """
    if mass_kg <= 0 or distance_m <= 0:
        return 1.0 # No dilation for zero mass or distance
    
    schwarzschild_radius = calculate_schwarzschild_radius(mass_kg)
    if distance_m <= schwarzschild_radius:
        return 0.0 # At or inside event horizon, time stops for an outside observer
    
    return np.sqrt(1 - (schwarzschild_radius / distance_m))

def calculate_relativistic_activity(isotope_name, initial_activity_Bq, time_elapsed_days, mass_kg, distance_m):
    """
    Calculates activity as observed by a distant observer, accounting for gravitational time dilation.
    """
    half_life_days = MANUAL_GAMMA_DATA.get(isotope_name, {}).get("half_life_days")
    if half_life_days is None:
        cid = _get_compound_cid_pubchem(isotope_name)
        if cid:
            hl_val, hl_unit = _fetch_half_life_from_pubchem(cid)
            if hl_val:
                half_life_days = _convert_half_life_to_days(hl_val, hl_unit)

    if half_life_days is None:
        return None

    decay_constant = calculate_decay_constant(half_life_days)
    time_elapsed_seconds = time_elapsed_days * 24 * 3600

    # Apply time dilation
    dilation_factor = calculate_dilation_factor(mass_kg, distance_m)
    
    # An observer far away sees time running slower for the object, so the decay process appears slowed.
    effective_time_local_seconds = time_elapsed_seconds * dilation_factor
    
    current_activity_Bq = calculate_activity_at_time(initial_activity_Bq, decay_constant, effective_time_local_seconds)
    
    gamma_yields = MANUAL_GAMMA_DATA.get(isotope_name, {}).get("gamma_yields_per_decay", [0])
    total_gamma_yield_per_decay = sum(gamma_yields)
    total_gamma_disintegrations_per_second = current_activity_Bq * total_gamma_yield_per_decay
    gamma_curies = total_gamma_disintegrations_per_second / CURIE_TO_BQ

    return gamma_curies

def calculate_relativistic_peak_wavelength(local_temperature_K, mass_kg, distance_m):
    """
    Calculates the observed peak wavelength of a blackbody, accounting for gravitational redshift.
    """
    if local_temperature_K <= 0:
        return None

    # Gravitational redshift means a distant observer sees the blackbody as cooler.
    dilation_factor = calculate_dilation_factor(mass_kg, distance_m)
    observed_temperature_K = local_temperature_K * dilation_factor

    # Use Wien's law with the observed temperature
    if observed_temperature_K <= 0:
        return np.inf # Effectively no peak wavelength observed
    
    peak_wavelength_m = WIEN_DISPLACEMENT_CONSTANT / observed_temperature_K
    return peak_wavelength_m * 1e9 # Convert to nm

# --- NEW: Quantum and Gravitational Bridge Functions ---
def calculate_gravitational_curvature_parameter(mass_kg, distance_m):
    """
    A simplified, unitless parameter representing local gravitational curvature.
    Higher values imply greater curvature.
    """
    schwarzschild_radius = calculate_schwarzschild_radius(mass_kg)
    if distance_m <= 0 or distance_m <= schwarzschild_radius:
        return np.inf
    return schwarzschild_radius / distance_m

def quantum_differentiation_hypothetical(curvature_parameter, base_quantum_value):
    """
    Hypothetical function to illustrate 'quantum mechanical differentiation'.
    This is NOT based on established physics but serves as an illustrative model.
    It proposes that a quantum value (e.g., a spin state) changes with gravitational curvature.
    
    For this example, we'll model it as a simple linear relationship for visualization.
    """
    if curvature_parameter >= 1:
        return 0 # At or inside the event horizon, the effect is undefined or at its maximum
    
    # A hypothetical effect: the base value is 'differentiated' by the curvature.
    # We use (1 - curvature_parameter) to get a value that goes from 1 to 0 as curvature increases.
    differentiated_value = base_quantum_value * (1 - curvature_parameter)
    return differentiated_value

# --- Utility Functions (Unchanged) ---
def generate_normal_distribution_series(mean, std_dev, num_points=200, spread_factor=3):
    start = mean - spread_factor * std_dev
    end = mean + spread_factor * std_dev
    if start < 0:
        start = 0
    series_values = np.linspace(start, end, num_points)
    pdf_values = (1 / (std_dev * np.sqrt(2 * np.pi))) * np.exp(-0.5 * ((series_values - mean) / std_dev)**2)
    return series_values, pdf_values

def get_gamma_curies_single_point(isotope_name, initial_activity_Bq, time_elapsed_days):
    half_life_days = None
    if isotope_name in MANUAL_GAMMA_DATA:
        half_life_days = MANUAL_GAMMA_DATA[isotope_name]["half_life_days"]
    else:
        cid = _get_compound_cid_pubchem(isotope_name)
        if cid:
            hl_val, hl_unit = _fetch_half_life_from_pubchem(cid)
            if hl_val:
                half_life_days = _convert_half_life_to_days(hl_val, hl_unit)

    if half_life_days is None:
        return None
    gamma_yields = MANUAL_GAMMA_DATA.get(isotope_name, {}).get("gamma_yields_per_decay", [0])
    total_gamma_yield_per_decay = sum(gamma_yields)
    decay_constant = calculate_decay_constant(half_life_days)
    time_elapsed_seconds = time_elapsed_days * 24 * 3600
    current_activity_Bq = calculate_activity_at_time(initial_activity_Bq, decay_constant, time_elapsed_seconds)
    total_gamma_disintegrations_per_second = current_activity_Bq * total_gamma_yield_per_decay
    gamma_curies = total_gamma_disintegrations_per_second / CURIE_TO_BQ
    return gamma_curies

# --- Analysis Functions (Modified to include new relativistic options) ---

def analyze_gamma_curies_comparison(isotope_name, initial_activity_Bq, mean_time_days, std_dev_time_days, mass_kg, distance_m, num_points=200):
    """
    Compares gamma curies decay in flat spacetime vs. near a massive k-body.
    """
    print(f"\n--- Comparing Gamma Curies: Flat Spacetime vs. Relativistic Environment ---")
    print(f"  Isotope: {isotope_name}, Initial Activity: {initial_activity_Bq:.2e} Bq")
    print(f"  Relativistic Environment: k-body Mass={mass_kg:.2e} kg, Distance={distance_m:.2e} m")
    
    schwarzschild_radius = calculate_schwarzschild_radius(mass_kg)
    dilation_factor = calculate_dilation_factor(mass_kg, distance_m)
    print(f"  Schwarzschild Radius: {schwarzschild_radius:.2e} m")
    print(f"  Gravitational Time Dilation Factor: {dilation_factor:.4f}")
    if dilation_factor < 0.99:
        print(f"  (A dilation factor less than 1 means time runs slower in the gravitational field.)")
    else:
        print(f"  (Dilation factor is near 1, indicating a weak gravitational effect.)")

    time_series_days, _ = generate_normal_distribution_series(mean_time_days, std_dev_time_days, num_points=num_points)

    # Flat Spacetime Calculation
    flat_curies_series = [get_gamma_curies_single_point(isotope_name, initial_activity_Bq, t_days) for t_days in time_series_days]

    # Relativistic Calculation
    relativistic_curies_series = [calculate_relativistic_activity(isotope_name, initial_activity_Bq, t_days, mass_kg, distance_m) for t_days in time_series_days]

    if all(c is None for c in flat_curies_series):
        print(f"  Calculation failed. Please ensure '{isotope_name}' has half-life data.")
        return

    # Plotting
    plt.figure(figsize=(12, 7))
    plt.plot(time_series_days, flat_curies_series, label='Flat Spacetime (Local Observer)')
    plt.plot(time_series_days, relativistic_curies_series, label=f'Relativistic (Distant Observer)', linestyle='--')
    plt.title(f'Gamma Curies Decay: {isotope_name} - Flat vs. Relativistic')
    plt.xlabel('Time (days)')
    plt.ylabel('Gamma Curies (Ci)')
    plt.grid(True)
    plt.legend()
    plt.tight_layout()
    plt.show()

def simulate_blackbody_wien_comparison(temperatures_K, mass_kg, distance_m, wavelength_range_nm=(1, 10000)):
    """
    Compares blackbody peak wavelength in flat spacetime vs. near a massive k-body.
    """
    print(f"\n--- Comparing Blackbody Peak Wavelength: Flat vs. Relativistic ---")
    print(f"  Analyzing temperatures: {temperatures_K} K")
    print(f"  Relativistic Environment: k-body Mass={mass_kg:.2e} kg, Distance={distance_m:.2e} m")

    schwarzschild_radius = calculate_schwarzschild_radius(mass_kg)
    dilation_factor = calculate_dilation_factor(mass_kg, distance_m)
    print(f"  Schwarzschild Radius: {schwarzschild_radius:.2e} m")
    print(f"  Gravitational Time Dilation Factor: {dilation_factor:.4f}")

    temp_list_sorted = sorted(temperatures_K)
    flat_peak_wl_list_nm = [(WIEN_DISPLACEMENT_CONSTANT / T) * 1e9 for T in temp_list_sorted if T > 0]
    
    relativistic_peak_wl_list_nm = [calculate_relativistic_peak_wavelength(T, mass_kg, distance_m) for T in temp_list_sorted if T > 0]

    # Plotting
    plt.figure(figsize=(8, 6))
    plt.plot(temp_list_sorted, flat_peak_wl_list_nm, 'o-', label='Flat Spacetime (Local Observer)')
    plt.plot(temp_list_sorted, relativistic_peak_wl_list_nm, 'r--', label='Relativistic (Distant Observer)')
    plt.title('Wien\'s Law: Peak Wavelength vs. Temperature (Flat vs. Relativistic)')
    plt.xlabel('Temperature (K)')
    plt.ylabel('Peak Wavelength (nm)')
    plt.grid(True)
    plt.legend()
    plt.tight_layout()
    plt.show()

def explore_quantum_gravitational_differentiation(mass_kg, distance_m_series):
    """
    New function to explore the hypothetical relationship between gravity and a quantum state.
    """
    print("\n--- Exploring a Hypothetical Gravitational-Quantum Bridge ---")
    print(f"  Analyzing Quantum State near a k-body with Mass={mass_kg:.2e} kg")

    curvature_series = [calculate_gravitational_curvature_parameter(mass_kg, d) for d in distance_m_series]
    
    # Let's use a base quantum value, for example, a spin state (up/down).
    # We'll use 1.0 as a base value to represent an "undifferentiated" quantum state.
    base_quantum_value = 1.0
    
    differentiated_value_series = [quantum_differentiation_hypothetical(c, base_quantum_value) for c in curvature_series]

    plt.figure(figsize=(10, 6))
    plt.plot(distance_m_series, differentiated_value_series, 'b-')
    plt.axvline(x=calculate_schwarzschild_radius(mass_kg), color='r', linestyle='--', label='Schwarzschild Radius')
    plt.title('Hypothetical "Quantum Differentiation" vs. Distance from a k-body')
    plt.xlabel('Distance from k-body (m)')
    plt.ylabel('Hypothetical Differentiated Quantum Value (Unitless)')
    plt.grid(True)
    plt.legend()
    plt.tight_layout()
    plt.show()


# --- Main Program Execution (Modified) ---
if __name__ == "__main__":
    print("--- Radiation Analysis Program: Interpreting Data Behavior ---")
    print("\nThis program explores two physics concepts in both a standard, flat-spacetime environment and a relativistic one.")
    
    print("1. **Radioactive Decay:** Compares the decay rate of an isotope as seen by a local observer vs. a distant one in a gravitational field.")
    print("2. **Blackbody Radiation:** Compares the peak emission wavelength of a blackbody as seen by a local observer vs. a distant one (gravitational redshift).")
    print("3. **Hypothetical Quantum-Gravity Bridge:** A new, conceptual section to explore how a quantum state might be 'differentiated' by a gravitational field.\n")

    while True:
        main_choice = input("Select: (R)elativistic Comparison, (Q)uantum-Gravity, (E)xit? ").strip().lower()

        if main_choice == 'e':
            break
        elif main_choice == 'r':
            sub_choice = input("Select Relativistic Scenario: (N)uclear Decay or (B)lackbody Radiation? ").strip().lower()

            try:
                # Get common relativistic parameters
                k_body_mass = float(input("\n  Enter mass of the k-body (e.g., a star) in kg: "))
                k_body_distance = float(input("  Enter distance from the center of the k-body in meters: "))
            except ValueError:
                print("  Invalid input. Please enter numerical values for mass and distance.")
                continue

            if sub_choice == 'n':
                print("\n--- Relativistic Nuclear Decay Comparison ---")
                isotope_name = input("  Enter radioactive isotope name (e.g., Cesium-137): ").strip()
                if isotope_name not in MANUAL_GAMMA_DATA and _get_compound_cid_pubchem(isotope_name) is None:
                    print(f"  '{isotope_name}' data not found.")
                    continue
                try:
                    initial_activity = float(input("  Enter initial activity in Bq: "))
                    mean_time = float(input("  Enter mean measurement time (days): "))
                    std_dev_time = float(input("  Enter standard deviation for time (days): "))
                except ValueError:
                    print("  Invalid input.")
                    continue
                
                analyze_gamma_curies_comparison(isotope_name, initial_activity, mean_time, std_dev_time, k_body_mass, k_body_distance)

            elif sub_choice == 'b':
                print("\n--- Relativistic Blackbody Radiation Comparison ---")
                temp_input_str = input("  Enter temperatures (K) separated by commas (e.g., 300, 1000, 5000): ")
                try:
                    temperatures_list = [float(t.strip()) for t in temp_input_str.split(',')]
                except ValueError:
                    print("  Invalid temperature input.")
                    continue
                simulate_blackbody_wien_comparison(temperatures_list, k_body_mass, k_body_distance)
            else:
                print("  Invalid choice. Please enter 'N' or 'B'.")

        elif main_choice == 'q':
            print("\n--- Quantum-Gravitational Bridge Exploration ---")
            try:
                k_body_mass = float(input("\n  Enter mass of the k-body (e.g., a black hole) in kg: "))
                # We'll generate a range of distances to show the effect
                max_distance_factor = float(input("  Enter max distance as a multiple of Schwarzschild Radius (e.g., 5): "))
                
                schwarzschild_radius_val = calculate_schwarzschild_radius(k_body_mass)
                if schwarzschild_radius_val == 0:
                    print("  Invalid mass. Cannot calculate Schwarzschild Radius.")
                    continue
                
                distance_series = np.linspace(schwarzschild_radius_val + 1e-9, max_distance_factor * schwarzschild_radius_val, 200)
                explore_quantum_gravitational_differentiation(k_body_mass, distance_series)

            except ValueError:
                print("  Invalid input. Please enter numerical values.")
                continue

        else:
            print("Invalid choice. Please enter 'R', 'Q', or 'E'.")

    print("\nExiting program. Thank you for exploring these concepts!")

0xEB Enthalpy of Formation - PubChem API

 

Okay, this is a much more chemically relevant challenge, but it still requires a clear definition of "chemical stability" that can be programmatically accessed or inferred from public data.

"Chemical stability" can mean several things:

  1. Thermodynamic Stability (relative to elements): Often quantified by the standard enthalpy of formation (ΔH_f°). A more negative (lower) ΔH_f° generally indicates a more thermodynamically stable compound relative to its constituent elements in their standard states.
  2. Thermodynamic Stability (relative to decomposition/reaction products): Requires comparing Gibbs free energies of various possible reactions, which is far too complex for a general PubChem API script.
  3. Kinetic Stability: How fast a compound decomposes or reacts. This involves activation energies and reaction mechanisms, almost never found directly in PubChem's property listings.
  4. Stability against specific conditions: E.g., hydrolytic stability, thermal stability (decomposition temperature). These might be mentioned qualitatively in PubChem.

For a practical demonstration using PubChem-like data retrieval, focusing on thermodynamic stability as indicated by the standard enthalpy of formation (ΔH_f°) is the most feasible and quantifiable approach.
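As a quick, self-contained illustration of that comparison (the ΔH_f° figures below are approximate textbook values quoted from memory; verify them against the NIST Chemistry WebBook before relying on them):

# Approximate standard enthalpies of formation in kJ/mol (illustrative textbook values).
delta_hf_kj_mol = {
    "carbon dioxide": -393.5,
    "water (liquid)": -285.8,
    "methane": -74.8,
}

# More negative ΔH_f° means more thermodynamically stable relative to the elements.
for name, value in sorted(delta_hf_kj_mol.items(), key=lambda kv: kv[1]):
    print(f"{name:>16}: {value:8.1f} kJ/mol")
most_stable = min(delta_hf_kj_mol, key=delta_hf_kj_mol.get)
print(f"Most stable relative to its elements: {most_stable}")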

Important Considerations and Limitations:

  • Data Availability: Similar to critical properties, the standard enthalpy of formation (ΔH_f°) is not consistently available for all compounds in PubChem. PubChem is a broad chemical information database, not a specialized thermochemical one.
  • Preferred Data Source: For reliable thermochemical data, specialized databases like the NIST Chemistry WebBook are far superior. While this program will attempt to retrieve from PubChem, be aware of its limitations.
  • Units: Enthalpy of formation can be in kJ/mol or kcal/mol. We'll need to standardize (e.g., to kJ/mol).
  • Parsing Fragility: Extracting these values from PubChem's text-based sections will again rely on heuristic string parsing, which can be fragile.
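To make the parsing-fragility point concrete, here is a minimal regex-based sketch of the kind of heuristic extraction involved (the full program below uses simple string splitting instead; neither approach survives every format PubChem may return):

import re

def parse_enthalpy(text):
    """Heuristically pull a signed number plus a kJ/mol or kcal/mol unit from free text."""
    match = re.search(r"(-?\d+(?:\.\d+)?)\s*(kj/mol|kcal/mol)", text.lower())
    if not match:
        return None, None
    return float(match.group(1)), match.group(2)

print(parse_enthalpy("Standard enthalpy of formation: -393.5 kJ/mol"))  # (-393.5, 'kj/mol')
print(parse_enthalpy("Heat of formation = -17.89 kcal/mol at 25 C"))    # (-17.89, 'kcal/mol')
print(parse_enthalpy("No numeric value reported"))                      # (None, None)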

Program Design:

  1. Input: The user will provide one or more compound names.
  2. PubChem Lookup (CID): Use pubchempy to get the CID.
  3. Fetch Enthalpy of Formation: A new function will query PubChem's pug_view endpoint for ΔH_f° data.
  4. Stability Analysis: If data is found for multiple compounds, compare their ΔH_f° values to determine relative stability.
  5. Error Handling: Gracefully handle missing data and API failures.

Refitted Program Code (Chemical Stability via Enthalpy of Formation)

Python
import numpy as np # Not strictly needed for this version, but good practice for scientific tools
import pubchempy as pcp
import requests
import json
import time

# --- PubChem Integration Functions ---
def get_compound_cid(compound_name):
    """Searches PubChem for a compound by name and returns its CID."""
    try:
        compounds = pcp.get_compounds(compound_name, 'name')
        if compounds:
            print(f"  Found '{compound_name}'. CID: {compounds[0].cid}")
            return compounds[0].cid
        else:
            print(f"  No compound found for '{compound_name}'.")
            return None
    except Exception as e:
        print(f"  Error getting CID for '{compound_name}': {e}")
        return None

def fetch_enthalpy_of_formation_from_pubchem(cid):
    """
    Fetches the Standard Enthalpy of Formation (ΔH_f°) from PubChem using PUG-REST.
    This property is often found in 'Chemical and Physical Properties' or
    'Depositor-Supplied' sections.
    """
    base_url = "https://pubchem.ncbi.nlm.nih.gov/rest/pug_view/data/compound/"
    url = f"{base_url}cid/{cid}/JSON/"
    enthalpy_of_formation = None
    unit = None

    print(f"  Attempting to fetch detailed data for CID {cid} from PubChem for ΔH_f°...")
    try:
        response = requests.get(url, timeout=30) # 30-second timeout
        response.raise_for_status() # Raise an exception for HTTP errors (4xx or 5xx)
        data = response.json()

        # Search for enthalpy of formation in common sections
        search_terms = ['enthalpy of formation', 'heat of formation', 'standard enthalpy of formation']

        for record in data.get('Record', {}).get('Section', []):
            if record.get('TOCHeading') in ['Chemical and Physical Properties', 'Depositor-Supplied Synonyms and Properties']:
                for prop_section in record.get('Section', []):
                    heading_lower = prop_section.get('TOCHeading', '').lower()
                    
                    found_term = False
                    for term in search_terms:
                        if term in heading_lower:
                            found_term = True
                            break
                    
                    if found_term:
                        for item in prop_section.get('Information', []):
                            for value_dict in item.get('Value', {}).get('StringWithMarkup', []):
                                text = value_dict.get('String')
                                if text:
                                    # Attempt to parse value and unit
                                    # This parsing is highly heuristic and may fail for varied formats
                                    text_lower = text.lower()
                                    try:
                                        if 'kj/mol' in text_lower:
                                            # Find the number before 'kJ/mol'
                                            val_str = text_lower.split('kj/mol')[0].strip().split(' ')[-1]
                                            enthalpy_of_formation = float(val_str)
                                            unit = 'kJ/mol'
                                            print(f"    Found ΔH_f°: {enthalpy_of_formation} {unit}")
                                            return enthalpy_of_formation, unit
                                        elif 'kcal/mol' in text_lower:
                                            val_str = text_lower.split('kcal/mol')[0].strip().split(' ')[-1]
                                            enthalpy_of_formation = float(val_str)
                                            unit = 'kcal/mol'
                                            print(f"    Found ΔH_f°: {enthalpy_of_formation} {unit}")
                                            return enthalpy_of_formation, unit
                                        # Add more unit parsing as needed
                                    except ValueError:
                                        pass # Failed to parse as float, continue searching

        print("  Could not find Standard Enthalpy of Formation in PubChem for this compound.")
        return None, None

    except requests.exceptions.RequestException as e:
        print(f"  Network or API error while fetching PubChem data: {e}")
        return None, None
    except json.JSONDecodeError as e:
        print(f"  Error parsing JSON from PubChem: {e}")
        return None, None
    except Exception as e:
        print(f"  An unexpected error occurred: {e}")
        return None, None

def convert_to_kj_per_mol(value, unit):
    """Converts enthalpy values to kJ/mol for consistent comparison."""
    if unit == 'kJ/mol':
        return value
    elif unit == 'kcal/mol':
        return value * 4.184 # 1 kcal = 4.184 kJ
    else:
        print(f"Warning: Unknown unit '{unit}'. Cannot convert.")
        return None

def analyze_stability(compounds_data):
    """
    Analyzes and compares the thermodynamic stability of compounds based on
    their standard enthalpy of formation (ΔH_f°).
    """
    if not compounds_data:
        print("\nNo data available for stability analysis.")
        return

    print("\n--- Chemical Stability Analysis (via Standard Enthalpy of Formation) ---")
    print("  Lower (more negative) ΔH_f° generally indicates greater thermodynamic stability")
    print("  relative to elements in their standard states.\n")

    valid_data = []
    for name, data in compounds_data.items():
        if data['enthalpy_kj_mol'] is not None:
            valid_data.append(data)
            print(f"  {name}: ΔH_f° = {data['enthalpy_kj_mol']:.2f} kJ/mol")
        else:
            print(f"  {name}: ΔH_f° data not found or invalid.")
    
    if not valid_data:
        print("  No valid enthalpy of formation data found for comparison.")
        return

    # Find the most stable compound(s)
    most_stable_value = min(d['enthalpy_kj_mol'] for d in valid_data)
    most_stable_compounds = [d['name'] for d in valid_data if d['enthalpy_kj_mol'] == most_stable_value]

    print("\n--- Analysis Result ---")
    if len(most_stable_compounds) == 1:
        print(f"  The most thermodynamically stable compound is: {most_stable_compounds[0]}")
    else:
        print(f"  The most thermodynamically stable compounds (with similar ΔH_f°) are: {', '.join(most_stable_compounds)}")
    print(f"  (ΔH_f°: {most_stable_value:.2f} kJ/mol)")
    print("-" * 60)


# --- Main execution loop ---
if __name__ == "__main__":
    compounds_to_analyze = {}

    while True:
        compound_name_input = input("\nEnter compound name to analyze (or 'done' to analyze current list, 'exit' to quit): ").strip()
        if compound_name_input.lower() == 'exit':
            break
        if compound_name_input.lower() == 'done':
            if compounds_to_analyze:
                analyze_stability(compounds_to_analyze)
                # Clear the list for a new analysis
                compounds_to_analyze = {} 
            else:
                print("No compounds entered for analysis yet.")
            continue # Continue to next prompt

        print(f"\n--- Searching for '{compound_name_input}' ---")
        cid = get_compound_cid(compound_name_input)

        if cid:
            time.sleep(0.1) # Small delay for API rate limits
            enthalpy, unit = fetch_enthalpy_of_formation_from_pubchem(cid)
            
            enthalpy_kj_mol = None
            if enthalpy is not None and unit is not None:
                enthalpy_kj_mol = convert_to_kj_per_mol(enthalpy, unit)
                if enthalpy_kj_mol is not None:
                    print(f"  {compound_name_input}: ΔH_f° = {enthalpy_kj_mol:.2f} kJ/mol (Original: {enthalpy} {unit})")
                    compounds_to_analyze[compound_name_input] = {
                        'name': compound_name_input,
                        'cid': cid,
                        'enthalpy_kj_mol': enthalpy_kj_mol
                    }
                else:
                    print(f"  Could not get valid ΔH_f° in kJ/mol for '{compound_name_input}'.")
            else:
                print(f"  ΔH_f° data not found for '{compound_name_input}'.")
        else:
            print(f"  Failed to find CID for '{compound_name_input}'.")

    print("\nExiting program.")


How to Run and Test:

  1. Save: Save the code as a Python file (e.g., chemical_stability.py).
  2. Install Libraries: If you haven't already, install the necessary libraries:
    Bash
    pip install numpy pubchempy requests
    
    (Note: numpy is included as a common scientific library, though not strictly required for this specific stability calculation).
  3. Execute: Run the script from your terminal:
    Bash
    python chemical_stability.py
    
  4. Enter Compound Names:
    • The program will prompt you to enter compound names one by one.
    • Good Candidates (likely to have ΔH_f°): Try "carbon dioxide", "methane", "water", "ethanol", "ammonia", "benzene". These are common compounds for which thermochemical data is usually available.
    • Less Common Compounds: For many arbitrary or complex compounds, you will likely find that ΔH_f° is not available in PubChem. The program will report this.
    • Type done: After entering several compounds, type done to initiate the stability analysis and compare the compounds you've entered.
    • Type exit: To quit the program.

Explanation of Changes:

  1. fetch_enthalpy_of_formation_from_pubchem(cid):

    • This is the core new function.
    • It uses direct PUG-REST calls (pug_view) to retrieve the full JSON record.
    • It now specifically searches for common terms like "enthalpy of formation" or "heat of formation" within the 'Chemical and Physical Properties' and 'Depositor-Supplied Synonyms and Properties' sections (a small JSON-inspection sketch follows after this list).
    • Heuristic Parsing: It attempts to extract a numerical value and a unit (e.g., "kJ/mol", "kcal/mol") from the unstructured text. This part is still highly heuristic and will fail if the text format is unexpected.
    • Returns the value and unit if found, otherwise None, None.
  2. convert_to_kj_per_mol(value, unit):

    • A utility function to standardize all fetched enthalpy values to kJ/mol, which is crucial for direct comparison.
  3. analyze_stability(compounds_data):

    • This function takes a dictionary of compounds and their fetched enthalpy data.
    • It filters for compounds where ΔH_f° was successfully retrieved and converted.
    • It then finds the minimum (most negative) ΔH_f° among the valid entries to identify the "most stable" compound(s).
    • Provides the interpretation that a lower ΔH_f° means greater stability.
  4. Main Execution Loop (if __name__ == "__main__":):

    • Modified to allow the user to input multiple compound names.
    • Stores the retrieved enthalpy_kj_mol for each compound in a compounds_to_analyze dictionary.
    • When the user types done, it calls analyze_stability with the collected data.
    • Includes time.sleep(0.1) for API rate limiting.
    • Error handling for API calls and data parsing is maintained.

This program provides a solid foundation for assessing one aspect of chemical stability using publicly available data, while clearly highlighting the challenges and limitations of relying on general chemical information databases for specific thermodynamic properties.