[{"content":"Introduction In this post, we will look into the Fed Funds cycles and evaluate asset class performance during loosening and tightening of monetary policy.\nPython Functions Here are the functions needed for this project:\ncalc_fed_cycle_asset_performance: Calculates the performance of various asset classes during the Fed Funds cycles. df_info: A simple function to display the information about a DataFrame and the first five rows and last five rows. df_info_markdown: Similar to the df_info function above, except that it coverts the output to markdown. export_track_md_deps: Exports various text outputs to markdown files, which are included in the index.md file created when building the site with Hugo. load_data: Load data from a CSV, Excel, or Pickle file into a pandas DataFrame. pandas_set_decimal_places: Set the number of decimal places displayed for floating-point numbers in pandas. plot_bar_returns_ffr_change: Plot the bar chart of the cumulative or annualized returns for the asset class along with the change in the Fed Funds Rate. plot_timeseries: Plot the timeseries data from a DataFrame for a specified date range and columns. plot_scatter_regression_ffr_vs_returns: Plot the scatter plot and regression of the annualized return for the asset class along with the annualized change in the Fed Funds Rate. yf_pull_data: Download daily price data from Yahoo Finance and export it. 
## Data Overview

### Acquire & Plot Fed Funds Data

First, let's get the data for the Fed Funds rate (FFR):

```python
# Set decimal places
pandas_set_decimal_places(4)

# Pull Effective Fed Funds Rate from FRED
fedfunds = web.DataReader("FEDFUNDS", "fred", start="1900-01-01", end=datetime.today())
fedfunds["FEDFUNDS"] = fedfunds["FEDFUNDS"] / 100  # Convert to decimal

# Resample to monthly frequency and compute change in rate
fedfunds_monthly = fedfunds.resample("M").last()
fedfunds_monthly = fedfunds_monthly[(fedfunds_monthly.index >= pd.to_datetime(start_date)) & (fedfunds_monthly.index <= pd.to_datetime(end_date))]
fedfunds_monthly["FedFunds_Change"] = fedfunds_monthly["FEDFUNDS"].diff()
```

This gives us:

```text
The columns, shape, and data types are:
<class 'pandas.core.frame.DataFrame'>
DatetimeIndex: 252 entries, 2004-11-30 to 2025-10-31
Freq: ME
Data columns (total 2 columns):
 #   Column           Non-Null Count  Dtype
---  ------           --------------  -----
 0   FEDFUNDS         252 non-null    float64
 1   FedFunds_Change  251 non-null    float64
dtypes: float64(2)
memory usage: 5.9 KB
```

The first 5 rows are:

| DATE | FEDFUNDS | FedFunds_Change |
|---|---|---|
| 2004-11-30 | 0.0193 | nan |
| 2004-12-31 | 0.0216 | 0.0023 |
| 2005-01-31 | 0.0228 | 0.0012 |
| 2005-02-28 | 0.0250 | 0.0022 |
| 2005-03-31 | 0.0263 | 0.0013 |

The last 5 rows are:

| DATE | FEDFUNDS | FedFunds_Change |
|---|---|---|
| 2025-06-30 | 0.0433 | 0.0000 |
| 2025-07-31 | 0.0433 | 0.0000 |
| 2025-08-31 | 0.0433 | 0.0000 |
| 2025-09-30 | 0.0422 | -0.0011 |
| 2025-10-31 | 0.0409 | -0.0013 |

We can then generate several useful visual aids (plots).
First, the FFR from the beginning of our data set (11/2004):

```python
plot_timeseries(
    price_df=fedfunds_monthly,
    plot_start_date=start_date,
    plot_end_date=end_date,
    plot_columns=["FEDFUNDS"],
    title="Fed Funds Rate",
    x_label="Date",
    x_format="Year",
    y_label="Rate (%)",
    y_format="Percentage",
    y_format_decimal_places=1,
    y_tick_spacing=0.005,
    grid=True,
    legend=False,
    export_plot=True,
    plot_file_name="01_Fed_Funds_Rate",
)
```

And then the month-to-month change in the FFR:

```python
plot_timeseries(
    price_df=fedfunds_monthly,
    plot_start_date=start_date,
    plot_end_date=end_date,
    plot_columns=["FedFunds_Change"],
    title="Fed Funds Change In Rate",
    x_label="Date",
    x_format="Year",
    y_label="Rate (%)",
    y_format="Percentage",
    y_format_decimal_places=2,
    y_tick_spacing=0.0025,
    grid=True,
    legend=False,
    export_plot=True,
    plot_file_name="01_Fed_Funds_Change_In_Rate",
)
```

This plot, in particular, makes it easy to see the monthly increases and decreases in the FFR, as well as the magnitude of each change (i.e., slow, drawn-out moves versus abrupt, large ones).

### Define Fed Policy Cycles

Next, we will define the Fed policy tightening and loosening cycles. This is done via visual inspection of the FFR plot, establishing timeframes for when each cycle started and ended.
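`plot_timeseries` wraps matplotlib; a stripped-down stand-in (hypothetical, omitting the axis formatting and export options the real helper supports) looks like:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for scripted runs
import matplotlib.pyplot as plt
import pandas as pd

def plot_timeseries_sketch(price_df: pd.DataFrame, plot_columns: list, title: str):
    """Minimal stand-in for plot_timeseries: line plot of the chosen columns."""
    fig, ax = plt.subplots()
    for col in plot_columns:
        ax.plot(price_df.index, price_df[col], label=col)
    ax.set_title(title)
    ax.grid(True)
    return fig

idx = pd.to_datetime(["2024-01-31", "2024-02-29", "2024-03-31"])
df = pd.DataFrame({"FEDFUNDS": [0.0533, 0.0533, 0.0533]}, index=idx)
fig = plot_timeseries_sketch(df, ["FEDFUNDS"], "Fed Funds Rate")
```

The real helper also handles percentage tick formatting, tick spacing, and saving the figure under the `plot_file_name` given in each call.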
Here's the list of cycles:

```python
# Define manually specified Fed policy cycles
fed_cycles = [
    ("2004-11-01", "2006-07-01"),
    ("2006-07-01", "2007-07-01"),
    ("2007-07-01", "2008-12-01"),
    ("2008-12-01", "2015-11-01"),
    ("2015-11-01", "2019-01-01"),
    ("2019-01-01", "2019-07-01"),
    ("2019-07-01", "2020-04-01"),
    ("2020-04-01", "2022-02-01"),
    ("2022-02-01", "2023-08-01"),
    ("2023-08-01", "2024-08-01"),
    ("2024-08-01", datetime.today().strftime('%Y-%m-%d')),
]

# Optional: assign a name to each cycle
cycle_labels = [f"Cycle {i+1}" for i in range(len(fed_cycles))]
```

And here's the total change in the FFR corresponding to each cycle:

```python
# Set decimal places
pandas_set_decimal_places(4)

# Calc changes by fed cycle defined above
fed_changes = []
for (start, end) in fed_cycles:
    start = pd.to_datetime(start)
    end = pd.to_datetime(end)
    try:
        rate_start = fedfunds.loc[start, "FEDFUNDS"]
    except KeyError:
        # Fall back to the most recent observation on or before the date
        rate_start = fedfunds.loc[:start].iloc[-1]["FEDFUNDS"]
    try:
        rate_end = fedfunds.loc[end, "FEDFUNDS"]
    except KeyError:
        rate_end = fedfunds.loc[:end].iloc[-1]["FEDFUNDS"]
    change = rate_end - rate_start
    fed_changes.append(change)

fed_changes_df = pd.DataFrame({
    "Cycle": cycle_labels,
    "FedFunds_Change": fed_changes
})
```

Which gives us the following cycles and cumulative change in rate per cycle:

| Cycle | FedFunds_Change |
|---|---|
| Cycle 1 | 0.0331 |
| Cycle 2 | 0.0002 |
| Cycle 3 | -0.0510 |
| Cycle 4 | -0.0004 |
| Cycle 5 | 0.0228 |
| Cycle 6 | 0.0000 |
| Cycle 7 | -0.0235 |
| Cycle 8 | 0.0003 |
| Cycle 9 | 0.0525 |
| Cycle 10 | 0.0000 |
| Cycle 11 | -0.0145 |

## Return Performance By Fed Policy Cycle

Moving on, we will now look at the performance of three different asset classes during each Fed cycle. We'll use SPY as a proxy for stocks, TLT as a proxy for bonds, and GLD as a proxy for gold. These datasets are slightly limiting, since all three funds only have data starting in late 2004, but they will work for our simple exercise. In a future post, we'll look to use Bloomberg indices instead.

### Stocks (SPY)

First, we pull data for SPY with the following:

```python
# Set decimal places
pandas_set_decimal_places(2)

yf_pull_data(
    base_directory=DATA_DIR,
    ticker="SPY",
    source="Yahoo_Finance",
    asset_class="Exchange_Traded_Funds",
    excel_export=True,
    pickle_export=True,
    output_confirmation=True,
)
```

And then load data with the following:

```python
spy = load_data(
    base_directory=DATA_DIR,
    ticker="SPY",
    source="Yahoo_Finance",
    asset_class="Exchange_Traded_Funds",
    timeframe="Daily",
    file_format="pickle",
)

# Filter SPY to date range
spy = spy[(spy.index >= pd.to_datetime(start_date)) & (spy.index <= pd.to_datetime(end_date))]

# Resample to monthly frequency
spy_monthly = spy.resample("M").last()
spy_monthly["Monthly_Return"] = spy_monthly["Close"].pct_change()
```

Which gives us the following:

```text
The columns, shape, and data types are:
<class 'pandas.core.frame.DataFrame'>
DatetimeIndex: 252 entries, 2004-11-30 to 2025-10-31
Freq: ME
Data columns (total 6 columns):
 #   Column          Non-Null Count  Dtype
---  ------          --------------  -----
 0   Close           252 non-null    float64
 1   High            252 non-null    float64
 2   Low             252 non-null    float64
 3   Open            252 non-null    float64
 4   Volume          252 non-null    int64
 5   Monthly_Return  251 non-null    float64
dtypes: float64(5), int64(1)
memory usage: 13.8 KB
```

The first 5 rows are:

| Date | Close | High | Low | Open | Volume | Monthly_Return |
|---|---|---|---|---|---|---|
| 2004-11-30 | 79.83 | 80.07 | 79.66 | 79.90 | 53685200 | nan |
| 2004-12-31 | 82.23 | 82.77 | 82.19 | 82.53 | 28648800 | 0.03 |
| 2005-01-31 | 80.39 | 80.45 | 80.09 | 80.25 | 52532700 | -0.02 |
| 2005-02-28 | 82.07 | 82.53 | 81.67 | 82.43 | 69381300 | 0.02 |
| 2005-03-31 | 80.57 | 80.91 | 80.51 | 80.73 | 64575400 | -0.02 |

The last 5 rows are:

| Date | Close | High | Low | Open | Volume | Monthly_Return |
|---|---|---|---|---|---|---|
| 2025-06-30 | 616.14 | 617.51 | 613.34 | 615.67 | 92502500 | 0.05 |
| 2025-07-31 | 630.33 | 638.08 | 629.03 | 637.69 | 103385200 | 0.02 |
| 2025-08-31 | 643.27 | 646.05 | 641.36 | 645.68 | 74522200 | 0.02 |
| 2025-09-30 | 666.18 | 666.65 | 661.61 | 662.93 | 86288000 | 0.04 |
| 2025-10-31 | 682.06 | 685.08 | 679.24 | 685.04 | 87164100 | 0.02 |

Next, we can plot the price history before calculating the cycle performance:

```python
plot_timeseries(
    price_df=spy,
    plot_start_date=start_date,
    plot_end_date=end_date,
    plot_columns=["Close"],
    title="SPY Close Price",
    x_label="Date",
    x_format="Year",
    y_label="Price ($)",
    y_format="Decimal",
    y_tick_spacing=50,
    grid=True,
    legend=False,
    export_plot=True,
    plot_file_name="02_SPY_Price",
    y_format_decimal_places=0,
)
```

Next, we will calculate the performance for SPY based on the pre-defined Fed cycles:

```python
spy_cycle_df = calc_fed_cycle_asset_performance(
    fed_cycles=fed_cycles,
    cycle_labels=cycle_labels,
    fed_changes=fed_changes,
    monthly_df=spy_monthly,
)
```

Which gives us:

| Cycle | Start | End | Months | CumulativeReturn | CumulativeReturnPct | AverageMonthlyReturn | AverageMonthlyReturnPct | AnnualizedReturn | AnnualizedReturnPct | Volatility | FedFundsChange | FedFundsChange_bps | FFR_AnnualizedChange | FFR_AnnualizedChange_bps |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Cycle 1 | 2004-11-01 | 2006-07-01 | 20 | 0.11 | 11.32 | 0.01 | 0.59 | 0.07 | 6.64 | 0.08 | 0.03 | 331.00 | 0.02 | 198.60 |
| Cycle 2 | 2006-07-01 | 2007-07-01 | 12 | 0.20 | 20.36 | 0.02 | 1.57 | 0.20 | 20.36 | 0.07 | 0.00 | 2.00 | 0.00 | 2.00 |
| Cycle 3 | 2007-07-01 | 2008-12-01 | 17 | -0.39 | -38.55 | -0.03 | -2.67 | -0.29 | -29.09 | 0.19 | -0.05 | -510.00 | -0.04 | -360.00 |
| Cycle 4 | 2008-12-01 | 2015-11-01 | 83 | 1.67 | 167.34 | 0.01 | 1.28 | 0.15 | 15.28 | 0.15 | -0.00 | -4.00 | -0.00 | -0.58 |
| Cycle 5 | 2015-11-01 | 2019-01-01 | 38 | 0.28 | 28.30 | 0.01 | 0.70 | 0.08 | 8.19 | 0.11 | 0.02 | 228.00 | 0.01 | 72.00 |
| Cycle 6 | 2019-01-01 | 2019-07-01 | 6 | 0.18 | 18.33 | 0.03 | 2.95 | 0.40 | 40.01 | 0.18 | 0.00 | 0.00 | 0.00 | 0.00 |
| Cycle 7 | 2019-07-01 | 2020-04-01 | 9 | -0.11 | -10.67 | -0.01 | -1.10 | -0.14 | -13.96 | 0.19 | -0.02 | -235.00 | -0.03 | -313.33 |
| Cycle 8 | 2020-04-01 | 2022-02-01 | 22 | 0.79 | 79.13 | 0.03 | 2.78 | 0.37 | 37.43 | 0.16 | 0.00 | 3.00 | 0.00 | 1.64 |
| Cycle 9 | 2022-02-01 | 2023-08-01 | 18 | 0.04 | 4.18 | 0.00 | 0.40 | 0.03 | 2.77 | 0.21 | 0.05 | 525.00 | 0.03 | 350.00 |
| Cycle 10 | 2023-08-01 | 2024-08-01 | 12 | 0.22 | 22.00 | 0.02 | 1.75 | 0.22 | 22.00 | 0.15 | 0.00 | 0.00 | 0.00 | 0.00 |
| Cycle 11 | 2024-08-01 | 2025-12-09 | 15 | 0.26 | 25.72 | 0.02 | 1.59 | 0.20 | 20.09 | 0.11 | -0.01 | -145.00 | -0.01 | -116.00 |

This gives us the following data points:

- Cycle start date
- Cycle end date
- Number of months in the cycle
- Cumulative return during the cycle (decimal and percent)
- Average monthly return during the cycle (decimal and percent)
- Annualized return during the cycle (decimal and percent)
- Return volatility during the cycle
- Cumulative change in FFR during the cycle (decimal and basis points)
- Annualized change in FFR during the cycle (decimal and basis points)

From the above DataFrame, we can then plot the cumulative and annualized returns for each cycle in a bar chart. First, the cumulative returns along with the cumulative change in FFR:

And then the annualized returns along with the annualized change in FFR:

The cumulative returns plot is not particularly insightful, but there are some interesting observations to be gained from the annualized returns plot. During the past two rate-cutting cycles (cycles 3 and 7), stocks exhibited negative returns while rates were being cut. However, once the rate-cutting cycle was complete, returns during the following cycle (when rates were usually flat) were quite strong and above the historical mean return for the S&P 500. The economic intuition for this behavior is sound: as the economy weakens, investors grow concerned about equity valuations, returns turn negative, and the Fed responds by cutting rates. The exact timing of when the Fed begins cutting is one of the unknowns; the Fed could be ahead of the curve, cutting rates as economic data begins to prompt that action, or behind the curve, where the economy rolls over rapidly and even the Fed's actions are not enough to halt the contraction.

Finally, we can run an OLS regression to check fit:

```python
df = spy_cycle_df

####################################
### Don't modify below this line ###
####################################

# Run OLS regression with statsmodels
X = df["FFR_AnnualizedChange_bps"]
y = df["AnnualizedReturnPct"]
X = sm.add_constant(X)
model = sm.OLS(y, X).fit()

print(model.summary())
print(f"Intercept: {model.params[0]}, Slope: {model.params[1]}")  # Intercept and slope

# Calc X and Y values for regression line (use the raw predictor, not the constant-augmented X)
X_vals = np.linspace(df["FFR_AnnualizedChange_bps"].min(), df["FFR_AnnualizedChange_bps"].max(), 100)
Y_vals = model.params[0] + model.params[1] * X_vals
```

Which gives us the results of the OLS regression:

```text
                            OLS Regression Results
===============================================================================
Dep. Variable:     AnnualizedReturnPct   R-squared:                       0.180
Model:                             OLS   Adj. R-squared:                  0.089
Method:                  Least Squares   F-statistic:                     1.973
Date:                 Tue, 09 Dec 2025   Prob (F-statistic):              0.194
Time:                         01:22:30   Log-Likelihood:                -47.173
No. Observations:                   11   AIC:                             98.35
Df Residuals:                        9   BIC:                             99.14
Df Model:                            1
Covariance Type:             nonrobust
============================================================================================
                               coef    std err          t      P>|t|      [0.025      0.975]
--------------------------------------------------------------------------------------------
const                      12.4404      5.894      2.111      0.064      -0.893      25.774
FFR_AnnualizedChange_bps    0.0430      0.031      1.405      0.194      -0.026       0.112
==============================================================================
Omnibus:                        1.065   Durbin-Watson:                   3.078
Prob(Omnibus):                  0.587   Jarque-Bera (JB):                0.665
Skew:                           0.026   Prob(JB):                        0.717
Kurtosis:                       1.796   Cond. No.                         193.
==============================================================================

Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
```

And then plot the regression line along with the values:

```python
plot_scatter_regression_ffr_vs_returns(
    cycle_df=spy_cycle_df,
    asset_label="SPY",
    index_num="02",
    x_vals=X_vals,
    y_vals=Y_vals,
    intercept=model.params[0],
    slope=model.params[1],
)
```

Which gives us:

Here we can see the data points for cycles 3 and 7 as mentioned above. Ignoring the data points where the annualized change in FFR is roughly zero (cycles 2, 4, 6, 8, and 10), cycles 1, 5, and 9 fit the economic thesis above, while cycle 11 (the current rate-cutting cycle) stands as an outlier.
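The relationship between the table's `Months`, `CumulativeReturn`, and `AnnualizedReturn` columns is the standard geometric annualization, which can be sketched as follows (an illustration, not the actual `calc_fed_cycle_asset_performance` internals):

```python
def annualize_return(cumulative_return: float, months: int) -> float:
    """Convert a cumulative return over `months` months to an annualized rate."""
    return (1.0 + cumulative_return) ** (12.0 / months) - 1.0

# Cycle 1: 11.32% over 20 months -> roughly 6.6% annualized
print(round(annualize_return(0.1132, 20), 4))

# Cycle 2: a 12-month cycle annualizes to itself
print(round(annualize_return(0.2036, 12), 4))
```

This is why the 12-month cycles (2 and 10) show identical cumulative and annualized percentages in the table.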
Of course, the book is not yet finished for cycle 11, and we could certainly see a bear market in stocks over the next several years.

### Bonds (TLT)

Next, we'll run a similar process for long-term bonds, using TLT as the proxy.

First, we pull data with the following:

```python
# Set decimal places
pandas_set_decimal_places(2)

yf_pull_data(
    base_directory=DATA_DIR,
    ticker="TLT",
    source="Yahoo_Finance",
    asset_class="Exchange_Traded_Funds",
    excel_export=True,
    pickle_export=True,
    output_confirmation=True,
)
```

And then load data with the following:

```python
tlt = load_data(
    base_directory=DATA_DIR,
    ticker="TLT",
    source="Yahoo_Finance",
    asset_class="Exchange_Traded_Funds",
    timeframe="Daily",
    file_format="pickle",
)

# Filter TLT to date range
tlt = tlt[(tlt.index >= pd.to_datetime(start_date)) & (tlt.index <= pd.to_datetime(end_date))]

# Resample to monthly frequency
tlt_monthly = tlt.resample("M").last()
tlt_monthly["Monthly_Return"] = tlt_monthly["Close"].pct_change()
```

Which gives us the following:

```text
The columns, shape, and data types are:
<class 'pandas.core.frame.DataFrame'>
DatetimeIndex: 252 entries, 2004-11-30 to 2025-10-31
Freq: ME
Data columns (total 6 columns):
 #   Column          Non-Null Count  Dtype
---  ------          --------------  -----
 0   Close           252 non-null    float64
 1   High            252 non-null    float64
 2   Low             252 non-null    float64
 3   Open            252 non-null    float64
 4   Volume          252 non-null    int64
 5   Monthly_Return  251 non-null    float64
dtypes: float64(5), int64(1)
memory usage: 13.8 KB
```

The first 5 rows are:

| Date | Close | High | Low | Open | Volume | Monthly_Return |
|---|---|---|---|---|---|---|
| 2004-11-30 | 43.97 | 44.08 | 43.81 | 43.97 | 1754500 | nan |
| 2004-12-31 | 45.14 | 45.19 | 45.01 | 45.05 | 1056400 | 0.03 |
| 2005-01-31 | 46.75 | 46.77 | 46.53 | 46.55 | 1313900 | 0.04 |
| 2005-02-28 | 46.06 | 46.61 | 45.99 | 46.61 | 2797300 | -0.01 |
| 2005-03-31 | 45.85 | 45.88 | 45.61 | 45.78 | 2410900 | -0.00 |

The last 5 rows are:

| Date | Close | High | Low | Open | Volume | Monthly_Return |
|---|---|---|---|---|---|---|
| 2025-06-30 | 86.33 | 86.52 | 85.71 | 85.95 | 53695200 | 0.03 |
| 2025-07-31 | 85.35 | 85.83 | 85.27 | 85.55 | 49814100 | -0.01 |
| 2025-08-31 | 85.36 | 85.61 | 85.20 | 85.52 | 41686400 | 0.00 |
| 2025-09-30 | 88.42 | 89.09 | 88.27 | 88.71 | 38584000 | 0.04 |
| 2025-10-31 | 89.64 | 90.01 | 89.56 | 89.91 | 38247300 | 0.01 |

Next, we can plot the price history before calculating the cycle performance:

```python
plot_timeseries(
    price_df=tlt,
    plot_start_date=start_date,
    plot_end_date=end_date,
    plot_columns=["Close"],
    title="TLT Close Price",
    x_label="Date",
    x_format="Year",
    y_label="Price ($)",
    y_format="Decimal",
    y_format_decimal_places=0,
    y_tick_spacing=10,
    grid=True,
    legend=False,
    export_plot=True,
    plot_file_name="03_TLT_Price",
)
```

Next, we will calculate the performance for TLT based on the pre-defined Fed cycles:

```python
tlt_cycle_df = calc_fed_cycle_asset_performance(
    fed_cycles=fed_cycles,
    cycle_labels=cycle_labels,
    fed_changes=fed_changes,
    monthly_df=tlt_monthly,
)
```

Which gives us:

| Cycle | Start | End | Months | CumulativeReturn | CumulativeReturnPct | AverageMonthlyReturn | AverageMonthlyReturnPct | AnnualizedReturn | AnnualizedReturnPct | Volatility | FedFundsChange | FedFundsChange_bps | FFR_AnnualizedChange | FFR_AnnualizedChange_bps |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Cycle 1 | 2004-11-01 | 2006-07-01 | 20 | 0.04 | 4.23 | 0.00 | 0.25 | 0.03 | 2.51 | 0.09 | 0.03 | 331.00 | 0.02 | 198.60 |
| Cycle 2 | 2006-07-01 | 2007-07-01 | 12 | 0.06 | 5.76 | 0.00 | 0.49 | 0.06 | 5.76 | 0.07 | 0.00 | 2.00 | 0.00 | 2.00 |
| Cycle 3 | 2007-07-01 | 2008-12-01 | 17 | 0.32 | 32.42 | 0.02 | 1.73 | 0.22 | 21.92 | 0.14 | -0.05 | -510.00 | -0.04 | -360.00 |
| Cycle 4 | 2008-12-01 | 2015-11-01 | 83 | 0.46 | 45.67 | 0.01 | 0.55 | 0.06 | 5.59 | 0.15 | -0.00 | -4.00 | -0.00 | -0.58 |
| Cycle 5 | 2015-11-01 | 2019-01-01 | 38 | 0.07 | 7.42 | 0.00 | 0.23 | 0.02 | 2.29 | 0.10 | 0.02 | 228.00 | 0.01 | 72.00 |
| Cycle 6 | 2019-01-01 | 2019-07-01 | 6 | 0.10 | 10.48 | 0.02 | 1.73 | 0.22 | 22.05 | 0.13 | 0.00 | 0.00 | 0.00 | 0.00 |
| Cycle 7 | 2019-07-01 | 2020-04-01 | 9 | 0.26 | 26.18 | 0.03 | 2.73 | 0.36 | 36.34 | 0.18 | -0.02 | -235.00 | -0.03 | -313.33 |
| Cycle 8 | 2020-04-01 | 2022-02-01 | 22 | -0.11 | -11.33 | -0.00 | -0.50 | -0.06 | -6.35 | 0.11 | 0.00 | 3.00 | 0.00 | 1.64 |
| Cycle 9 | 2022-02-01 | 2023-08-01 | 18 | -0.27 | -26.96 | -0.02 | -1.62 | -0.19 | -18.90 | 0.17 | 0.05 | 525.00 | 0.03 | 350.00 |
| Cycle 10 | 2023-08-01 | 2024-08-01 | 12 | -0.02 | -1.52 | 0.00 | 0.02 | -0.02 | -1.52 | 0.20 | 0.00 | 0.00 | 0.00 | 0.00 |
| Cycle 11 | 2024-08-01 | 2025-12-09 | 15 | 0.00 | 0.42 | 0.00 | 0.08 | 0.00 | 0.33 | 0.11 | -0.01 | -145.00 | -0.01 | -116.00 |

This gives us the following data points:

- Cycle start date
- Cycle end date
- Number of months in the cycle
- Cumulative return during the cycle (decimal and percent)
- Average monthly return during the cycle (decimal and percent)
- Annualized return during the cycle (decimal and percent)
- Return volatility during the cycle
- Cumulative change in FFR during the cycle (decimal and basis points)
- Annualized change in FFR during the cycle (decimal and basis points)

From the above DataFrame, we can then plot the cumulative and annualized returns for each cycle in a bar chart. First, the cumulative returns:

And then the annualized returns:

Let's focus our analysis on the plot comparing the annualized returns for TLT to the change in FFR. We can see that during cycles 3 and 7, the returns were very strong alongside a rapid pace of rate cuts.
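Per cycle, the core of `calc_fed_cycle_asset_performance` presumably slices the monthly return series between the cycle's start and end dates and compounds it; a hedged sketch of that step (the helper's actual internals are not shown in this post):

```python
import pandas as pd

def cycle_performance(monthly_returns: pd.Series, start: str, end: str) -> dict:
    """Compound the monthly returns that fall inside one cycle window."""
    window = monthly_returns.loc[pd.to_datetime(start):pd.to_datetime(end)].dropna()
    months = len(window)
    cumulative = (1.0 + window).prod() - 1.0
    annualized = (1.0 + cumulative) ** (12.0 / months) - 1.0
    return {"Months": months, "CumulativeReturn": cumulative, "AnnualizedReturn": annualized}

idx = pd.to_datetime(["2020-01-31", "2020-02-29", "2020-03-31"])
rets = pd.Series([0.01, 0.02, -0.01], index=idx)
stats = cycle_performance(rets, "2020-01-01", "2020-12-31")
```

Slicing a sorted `DatetimeIndex` with `.loc[start:end]` keeps only the month-end observations inside the window, so adjacent cycles that share a boundary date do not double-count months.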
During cycle 9, we see the opposite behavior: as rates were increased, bond returns were very poor. The question for cycle 11, where bond returns have been essentially flat, is whether the pace of rate cuts is simply not significant enough to benefit the bond market, or whether other factors are influencing long-term bond returns. Keep in mind that we are working with 20+ year Treasuries here; we could also consider running the analysis on investment-grade or high-yield corporate bonds.

Finally, we can run an OLS regression with the following code:

```python
df = tlt_cycle_df

####################################
### Don't modify below this line ###
####################################

# Run OLS regression with statsmodels
X = df["FFR_AnnualizedChange_bps"]
y = df["AnnualizedReturnPct"]
X = sm.add_constant(X)
model = sm.OLS(y, X).fit()

print(model.summary())
print(f"Intercept: {model.params[0]}, Slope: {model.params[1]}")  # Intercept and slope

# Calc X and Y values for regression line (use the raw predictor, not the constant-augmented X)
X_vals = np.linspace(df["FFR_AnnualizedChange_bps"].min(), df["FFR_AnnualizedChange_bps"].max(), 100)
Y_vals = model.params[0] + model.params[1] * X_vals
```

Which gives us the results of the OLS regression:

```text
                            OLS Regression Results
===============================================================================
Dep. Variable:     AnnualizedReturnPct   R-squared:                       0.623
Model:                             OLS   Adj. R-squared:                  0.582
Method:                  Least Squares   F-statistic:                     14.90
Date:                 Tue, 09 Dec 2025   Prob (F-statistic):            0.00385
Time:                         01:22:34   Log-Likelihood:                -39.665
No. Observations:                   11   AIC:                             83.33
Df Residuals:                        9   BIC:                             84.13
Df Model:                            1
Covariance Type:             nonrobust
============================================================================================
                               coef    std err          t      P>|t|      [0.025      0.975]
--------------------------------------------------------------------------------------------
const                       5.4676      2.978      1.836      0.100      -1.270      12.205
FFR_AnnualizedChange_bps   -0.0597      0.015     -3.860      0.004      -0.095      -0.025
==============================================================================
Omnibus:                        0.710   Durbin-Watson:                   1.219
Prob(Omnibus):                  0.701   Jarque-Bera (JB):                0.663
Skew:                           0.412   Prob(JB):                        0.718
Kurtosis:                       2.123   Cond. No.                         193.
==============================================================================

Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
```

And then plot the regression line along with the values:

```python
plot_scatter_regression_ffr_vs_returns(
    cycle_df=tlt_cycle_df,
    asset_label="TLT",
    index_num="03",
    x_vals=X_vals,
    y_vals=Y_vals,
    intercept=model.params[0],
    slope=model.params[1],
)
```

Which gives us:

The above plot is intriguing because of how well the OLS regression appears to fit the data.
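The fitted line can be reproduced without statsmodels: for a single predictor, simple least squares has a closed form, shown here on toy data standing in for the cycle DataFrame (an illustration, not the post's actual cycle values):

```python
import numpy as np

# Toy predictor/response standing in for FFR_AnnualizedChange_bps and AnnualizedReturnPct
x = np.array([-360.0, -313.0, -116.0, 0.0, 72.0, 198.0, 350.0])
y = 5.0 - 0.06 * x + np.array([0.5, -0.3, 0.2, -0.4, 0.1, 0.3, -0.2])  # line plus noise

# Closed-form simple OLS: slope = cov(x, y) / var(x), intercept from the means
slope = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
intercept = y.mean() - slope * x.mean()

# Regression line for plotting, as in the post
X_vals = np.linspace(x.min(), x.max(), 100)
Y_vals = intercept + slope * X_vals
```

With only 11 cycles, the confidence intervals in the statsmodels summaries above do most of the work; the point estimates alone say little.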
It certainly appears that during rate-cutting cycles, bonds are an asset that performs well.

### Gold (GLD)

Lastly, we'll look at the returns on gold, using the GLD ETF as a proxy.

First, we pull data with the following:

```python
# Set decimal places
pandas_set_decimal_places(2)

yf_pull_data(
    base_directory=DATA_DIR,
    ticker="GLD",
    source="Yahoo_Finance",
    asset_class="Exchange_Traded_Funds",
    excel_export=True,
    pickle_export=True,
    output_confirmation=True,
)
```

And then load data with the following:

```python
gld = load_data(
    base_directory=DATA_DIR,
    ticker="GLD",
    source="Yahoo_Finance",
    asset_class="Exchange_Traded_Funds",
    timeframe="Daily",
    file_format="pickle",
)

# Filter GLD to date range
gld = gld[(gld.index >= pd.to_datetime(start_date)) & (gld.index <= pd.to_datetime(end_date))]

# Resample to monthly frequency
gld_monthly = gld.resample("M").last()
gld_monthly["Monthly_Return"] = gld_monthly["Close"].pct_change()
```

Which gives us the following:

```text
The columns, shape, and data types are:
<class 'pandas.core.frame.DataFrame'>
DatetimeIndex: 252 entries, 2004-11-30 to 2025-10-31
Freq: ME
Data columns (total 6 columns):
 #   Column          Non-Null Count  Dtype
---  ------          --------------  -----
 0   Close           252 non-null    float64
 1   High            252 non-null    float64
 2   Low             252 non-null    float64
 3   Open            252 non-null    float64
 4   Volume          252 non-null    int64
 5   Monthly_Return  251 non-null    float64
dtypes: float64(5), int64(1)
memory usage: 13.8 KB
```

The first 5 rows are:

| Date | Close | High | Low | Open | Volume | Monthly_Return |
|---|---|---|---|---|---|---|
| 2004-11-30 | 45.12 | 45.41 | 44.82 | 45.37 | 3857200 | nan |
| 2004-12-31 | 43.80 | 43.94 | 43.73 | 43.85 | 531600 | -0.03 |
| 2005-01-31 | 42.22 | 42.30 | 41.96 | 42.21 | 1692400 | -0.04 |
| 2005-02-28 | 43.53 | 43.74 | 43.52 | 43.68 | 755300 | 0.03 |
| 2005-03-31 | 42.82 | 42.87 | 42.70 | 42.87 | 1363200 | -0.02 |

The last 5 rows are:

| Date | Close | High | Low | Open | Volume | Monthly_Return |
|---|---|---|---|---|---|---|
| 2025-06-30 | 304.83 | 304.92 | 301.95 | 302.39 | 8192100 | 0.00 |
| 2025-07-31 | 302.96 | 304.61 | 302.86 | 304.59 | 8981000 | -0.01 |
| 2025-08-31 | 318.07 | 318.09 | 314.64 | 314.72 | 15642600 | 0.05 |
| 2025-09-30 | 355.47 | 355.57 | 350.87 | 351.13 | 13312400 | 0.12 |
| 2025-10-31 | 368.12 | 370.66 | 365.50 | 370.47 | 11077900 | 0.04 |

Next, we can plot the price history before calculating the cycle performance:

```python
plot_timeseries(
    price_df=gld,
    plot_start_date=start_date,
    plot_end_date=end_date,
    plot_columns=["Close"],
    title="GLD Close Price",
    x_label="Date",
    x_format="Year",
    y_label="Price ($)",
    y_format="Decimal",
    y_format_decimal_places=0,
    y_tick_spacing=25,
    grid=True,
    legend=False,
    export_plot=True,
    plot_file_name="04_GLD_Price",
)
```

Next, we will calculate the performance for GLD based on the pre-defined Fed cycles:

```python
gld_cycle_df = calc_fed_cycle_asset_performance(
    fed_cycles=fed_cycles,
    cycle_labels=cycle_labels,
    fed_changes=fed_changes,
    monthly_df=gld_monthly,
)
```

Which gives us:

| Cycle | Start | End | Months | CumulativeReturn | CumulativeReturnPct | AverageMonthlyReturn | AverageMonthlyReturnPct | AnnualizedReturn | AnnualizedReturnPct | Volatility | FedFundsChange | FedFundsChange_bps | FFR_AnnualizedChange | FFR_AnnualizedChange_bps |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Cycle 1 | 2004-11-01 | 2006-07-01 | 20 | 0.36 | 35.70 | 0.02 | 1.73 | 0.20 | 20.10 | 0.17 | 0.03 | 331.00 | 0.02 | 198.60 |
| Cycle 2 | 2006-07-01 | 2007-07-01 | 12 | 0.05 | 4.96 | 0.00 | 0.45 | 0.05 | 4.96 | 0.11 | 0.00 | 2.00 | 0.00 | 2.00 |
| Cycle 3 | 2007-07-01 | 2008-12-01 | 17 | 0.25 | 24.96 | 0.02 | 1.59 | 0.17 | 17.03 | 0.26 | -0.05 | -510.00 | -0.04 | -360.00 |
| Cycle 4 | 2008-12-01 | 2015-11-01 | 83 | 0.36 | 36.10 | 0.01 | 0.51 | 0.05 | 4.56 | 0.18 | -0.00 | -4.00 | -0.00 | -0.58 |
| Cycle 5 | 2015-11-01 | 2019-01-01 | 38 | 0.11 | 10.93 | 0.00 | 0.35 | 0.03 | 3.33 | 0.14 | 0.02 | 228.00 | 0.01 | 72.00 |
| Cycle 6 | 2019-01-01 | 2019-07-01 | 6 | 0.10 | 9.86 | 0.02 | 1.63 | 0.21 | 20.68 | 0.12 | 0.00 | 0.00 | 0.00 | 0.00 |
| Cycle 7 | 2019-07-01 | 2020-04-01 | 9 | 0.11 | 11.15 | 0.01 | 1.24 | 0.15 | 15.13 | 0.13 | -0.02 | -235.00 | -0.03 | -313.33 |
| Cycle 8 | 2020-04-01 | 2022-02-01 | 22 | 0.14 | 13.54 | 0.01 | 0.69 | 0.07 | 7.17 | 0.16 | 0.00 | 3.00 | 0.00 | 1.64 |
| Cycle 9 | 2022-02-01 | 2023-08-01 | 18 | 0.08 | 8.48 | 0.01 | 0.53 | 0.06 | 5.58 | 0.14 | 0.05 | 525.00 | 0.03 | 350.00 |
| Cycle 10 | 2023-08-01 | 2024-08-01 | 12 | 0.24 | 24.24 | 0.02 | 1.89 | 0.24 | 24.24 | 0.13 | 0.00 | 0.00 | 0.00 | 0.00 |
| Cycle 11 | 2024-08-01 | 2025-12-09 | 15 | 0.62 | 62.49 | 0.03 | 3.36 | 0.47 | 47.46 | 0.14 | -0.01 | -145.00 | -0.01 | -116.00 |

This gives us the following data points:

- Cycle start date
- Cycle end date
- Number of months in the cycle
- Cumulative return during the cycle (decimal and percent)
- Average monthly return during the cycle (decimal and percent)
- Annualized return during the cycle (decimal and percent)
- Return volatility during the cycle
- Cumulative change in FFR during the cycle (decimal and basis points)
- Annualized change in FFR during the cycle (decimal and basis points)

From the above DataFrame, we can then plot the cumulative and annualized returns for each cycle in a bar chart.
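The `Volatility` column is most naturally read as annualized monthly-return volatility; a common convention (an assumption here, since the helper's internals aren't shown) is the square-root-of-time rule:

```python
import math
import statistics

def annualized_volatility(monthly_returns: list) -> float:
    """Annualize the sample standard deviation of monthly returns (sqrt-of-time rule)."""
    return statistics.stdev(monthly_returns) * math.sqrt(12)

vol = annualized_volatility([0.02, -0.01, 0.03, 0.00, -0.02, 0.01])
```

The sqrt-of-time scaling assumes roughly uncorrelated monthly returns; with serial correlation it under- or over-states the true annual figure.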
First, the cumulative returns:\nAnd then the annualized returns:\nWe see strong returns for gold across several different Fed cycles, so it is difficult to draw any kind of initial conclusion based on the bar charts.\nFinally, we can run an OLS regression with the following code:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 df = gld_cycle_df #################################### ### Don\u0026#39;t modify below this line ### #################################### # Run OLS regression with statsmodels X = df[\u0026#34;FFR_AnnualizedChange_bps\u0026#34;] y = df[\u0026#34;AnnualizedReturnPct\u0026#34;] X = sm.add_constant(X) model = sm.OLS(y, X).fit() print(model.summary()) print(f\u0026#34;Intercept: {model.params[0]}, Slope: {model.params[1]}\u0026#34;) # Intercept and slope # Calc X and Y values for regression line X_vals = np.linspace(X.min(), X.max(), 100) Y_vals = model.params[0] + model.params[1] * X_vals Which gives us the results of the OLS regression:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 OLS Regression Results =============================================================================== Dep. Variable: AnnualizedReturnPct R-squared: 0.084 Model: OLS Adj. R-squared: -0.018 Method: Least Squares F-statistic: 0.8274 Date: Tue, 09 Dec 2025 Prob (F-statistic): 0.387 Time: 01:22:37 Log-Likelihood: -42.830 No. Observations: 11 AIC: 89.66 Df Residuals: 9 BIC: 90.46 Df Model: 1 Covariance Type: nonrobust ============================================================================================ coef std err t P\u0026gt;|t| [0.025 0.975] -------------------------------------------------------------------------------------------- const 15.1947 3.972 3.826 0.004 6.210 24.179 FFR_AnnualizedChange_bps -0.0187 0.021 -0.910 0.387 -0.065 0.028 ============================================================================== Omnibus: 8.035 Durbin-Watson: 0.915 Prob(Omnibus): 0.018 Jarque-Bera (JB): 3.686 Skew: 1.328 Prob(JB): 0.158 Kurtosis: 3.993 Cond. 
No. 193. ============================================================================== Notes: [1] Standard Errors assume that the covariance matrix of the errors is correctly specified. And then plot the regression line along with the values:\n1 2 3 4 5 6 7 8 9 plot_scatter_regression_ffr_vs_returns( cycle_df=gld_cycle_df, asset_label=\u0026#34;GLD\u0026#34;, index_num=\u0026#34;04\u0026#34;, x_vals=X_vals, y_vals=Y_vals, intercept=model.params[0], slope=model.params[1], ) Which gives us:\nIt\u0026rsquo;s difficult to draw any strong conclusions from the above plot. Gold has traditionally been considered a hedge against inflation, and while managing inflation is one of the Fed\u0026rsquo;s mandates, there may be no clear relationship between changes in the FFR and gold\u0026rsquo;s historical returns.\nHybrid Portfolio With the above analysis (somewhat) complete, let\u0026rsquo;s look at the optimal allocation for a portfolio based on the data and the hypothetical historical results.\nRecall the plots for annualized returns vs annualized change in FFR for stocks, bonds, and gold:\nAsset Allocation We have to be careful with our criteria for when to hold stocks, bonds, or gold, as hindsight bias is certainly possible. So, without overanalyzing the results, let\u0026rsquo;s assume that we hold stocks as the default position, switch to bonds when the Fed starts cutting rates, and resume holding stocks when the Fed stops cutting. If there is no change in the FFR, we continue to hold stocks. 
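As a rough illustration, the rule can be approximated month by month: hold bonds in any month where the FFR declined, and stocks otherwise (including months with no change). This is a minimal sketch only, and the `allocate` helper below is hypothetical, not part of the project's code; the analysis itself assigns one asset per Fed cycle rather than per month:

```python
import pandas as pd

def allocate(ffr_change: pd.Series) -> pd.Series:
    # Hypothetical helper: hold bonds while the Fed is cutting,
    # stocks otherwise (including months with no change in the FFR).
    return ffr_change.apply(lambda chg: "Bonds" if chg < 0 else "Stocks")

# Toy example: a hike, a pause, a cut, and another pause
changes = pd.Series([0.0025, 0.0, -0.0025, 0.0])
print(allocate(changes).tolist())  # ['Stocks', 'Stocks', 'Bonds', 'Stocks']
```

Applying the rule at the cycle level (one asset per Fed cycle) rather than per month avoids whipsawing between assets on month-to-month noise in the FFR.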
That gives us:\nCycle 1: Stocks Cycle 2: Stocks Cycle 3: Bonds Cycle 4: Stocks Cycle 5: Stocks Cycle 6: Stocks Cycle 7: Bonds Cycle 8: Stocks Cycle 9: Stocks Cycle 10: Stocks Cycle 11: Bonds We can then combine the return series based on the above with the following code:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 # Calculate cumulative returns and drawdown for SPY spy_monthly[\u0026#39;Cumulative_Return\u0026#39;] = (1 + spy_monthly[\u0026#39;Monthly_Return\u0026#39;]).cumprod() - 1 spy_monthly[\u0026#39;Cumulative_Return_Plus_One\u0026#39;] = 1 + spy_monthly[\u0026#39;Cumulative_Return\u0026#39;] spy_monthly[\u0026#39;Rolling_Max\u0026#39;] = spy_monthly[\u0026#39;Cumulative_Return_Plus_One\u0026#39;].cummax() spy_monthly[\u0026#39;Drawdown\u0026#39;] = spy_monthly[\u0026#39;Cumulative_Return_Plus_One\u0026#39;] / spy_monthly[\u0026#39;Rolling_Max\u0026#39;] - 1 spy_monthly.drop(columns=[\u0026#39;Cumulative_Return_Plus_One\u0026#39;, \u0026#39;Rolling_Max\u0026#39;], inplace=True) # Calculate cumulative returns and drawdown for TLT tlt_monthly[\u0026#39;Cumulative_Return\u0026#39;] = (1 + tlt_monthly[\u0026#39;Monthly_Return\u0026#39;]).cumprod() - 1 tlt_monthly[\u0026#39;Cumulative_Return_Plus_One\u0026#39;] = 1 + tlt_monthly[\u0026#39;Cumulative_Return\u0026#39;] tlt_monthly[\u0026#39;Rolling_Max\u0026#39;] = tlt_monthly[\u0026#39;Cumulative_Return_Plus_One\u0026#39;].cummax() tlt_monthly[\u0026#39;Drawdown\u0026#39;] = tlt_monthly[\u0026#39;Cumulative_Return_Plus_One\u0026#39;] / tlt_monthly[\u0026#39;Rolling_Max\u0026#39;] - 1 tlt_monthly.drop(columns=[\u0026#39;Cumulative_Return_Plus_One\u0026#39;, \u0026#39;Rolling_Max\u0026#39;], inplace=True) # Isolate the returns for SPY and TLT spy_ret = spy_monthly[\u0026#39;Monthly_Return\u0026#39;] tlt_ret = tlt_monthly[\u0026#39;Monthly_Return\u0026#39;] # Create a blended portfolio based on 
Fed policy cycles portfolio = ( spy_ret[spy_ret.index \u0026lt;= \u0026#34;2007-07-01\u0026#34;] .combine_first(tlt_ret[(tlt_ret.index \u0026gt;= \u0026#34;2007-07-01\u0026#34;) \u0026amp; (tlt_ret.index \u0026lt;= \u0026#34;2008-12-01\u0026#34;)]) .combine_first(spy_ret[(spy_ret.index \u0026gt; \u0026#34;2008-12-01\u0026#34;) \u0026amp; (spy_ret.index \u0026lt;= \u0026#34;2019-07-01\u0026#34;)]) .combine_first(tlt_ret[(tlt_ret.index \u0026gt;= \u0026#34;2019-07-01\u0026#34;) \u0026amp; (tlt_ret.index \u0026lt;= \u0026#34;2020-04-01\u0026#34;)]) .combine_first(spy_ret[(spy_ret.index \u0026gt; \u0026#34;2020-04-01\u0026#34;) \u0026amp; (spy_ret.index \u0026lt;= \u0026#34;2024-08-01\u0026#34;)]) .combine_first(tlt_ret[tlt_ret.index \u0026gt; \u0026#34;2024-08-01\u0026#34;]) ) # Convert to DataFrame portfolio_monthly = portfolio.to_frame(name=\u0026#34;Portfolio_Monthly_Return\u0026#34;) # Calculate cumulative returns and drawdown for the portfolio portfolio_monthly[\u0026#39;Portfolio_Cumulative_Return\u0026#39;] = (1 + portfolio_monthly[\u0026#39;Portfolio_Monthly_Return\u0026#39;]).cumprod() - 1 portfolio_monthly[\u0026#39;Portfolio_Cumulative_Return_Plus_One\u0026#39;] = 1 + portfolio_monthly[\u0026#39;Portfolio_Cumulative_Return\u0026#39;] portfolio_monthly[\u0026#39;Portfolio_Rolling_Max\u0026#39;] = portfolio_monthly[\u0026#39;Portfolio_Cumulative_Return_Plus_One\u0026#39;].cummax() portfolio_monthly[\u0026#39;Portfolio_Drawdown\u0026#39;] = portfolio_monthly[\u0026#39;Portfolio_Cumulative_Return_Plus_One\u0026#39;] / portfolio_monthly[\u0026#39;Portfolio_Rolling_Max\u0026#39;] - 1 portfolio_monthly.drop(columns=[\u0026#39;Portfolio_Cumulative_Return_Plus_One\u0026#39;, \u0026#39;Portfolio_Rolling_Max\u0026#39;], inplace=True) # Merge \u0026#34;spy_monthly\u0026#34; and \u0026#34;tlt_monthly\u0026#34; into \u0026#34;portfolio_monthly\u0026#34; to compare cumulative returns portfolio_monthly = portfolio_monthly.join( 
spy_monthly[\u0026#39;Monthly_Return\u0026#39;].rename(\u0026#39;SPY_Monthly_Return\u0026#39;), how=\u0026#39;left\u0026#39; ).join( spy_monthly[\u0026#39;Cumulative_Return\u0026#39;].rename(\u0026#39;SPY_Cumulative_Return\u0026#39;), how=\u0026#39;left\u0026#39; ).join( spy_monthly[\u0026#39;Drawdown\u0026#39;].rename(\u0026#39;SPY_Drawdown\u0026#39;), how=\u0026#39;left\u0026#39; ).join( tlt_monthly[\u0026#39;Monthly_Return\u0026#39;].rename(\u0026#39;TLT_Monthly_Return\u0026#39;), how=\u0026#39;left\u0026#39; ).join( tlt_monthly[\u0026#39;Cumulative_Return\u0026#39;].rename(\u0026#39;TLT_Cumulative_Return\u0026#39;), how=\u0026#39;left\u0026#39; ).join( tlt_monthly[\u0026#39;Drawdown\u0026#39;].rename(\u0026#39;TLT_Drawdown\u0026#39;), how=\u0026#39;left\u0026#39; ) Which gives us:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 The columns, shape, and data types are: \u0026lt;class \u0026#39;pandas.core.frame.DataFrame\u0026#39;\u0026gt; DatetimeIndex: 252 entries, 2004-11-30 to 2025-10-31 Freq: ME Data columns (total 9 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Portfolio_Monthly_Return 251 non-null float64 1 Portfolio_Cumulative_Return 251 non-null float64 2 Portfolio_Drawdown 251 non-null float64 3 SPY_Monthly_Return 251 non-null float64 4 SPY_Cumulative_Return 251 non-null float64 5 SPY_Drawdown 251 non-null float64 6 TLT_Monthly_Return 251 non-null float64 7 TLT_Cumulative_Return 251 non-null float64 8 TLT_Drawdown 251 non-null float64 dtypes: float64(9) memory usage: 19.7 KB The first 5 rows are:\nDate Portfolio_Monthly_Return Portfolio_Cumulative_Return Portfolio_Drawdown SPY_Monthly_Return SPY_Cumulative_Return SPY_Drawdown TLT_Monthly_Return TLT_Cumulative_Return TLT_Drawdown 2004-11-30 00:00:00 nan nan nan nan nan nan nan nan nan 2004-12-31 00:00:00 0.030 0.030 0.000 0.030 0.030 0.000 0.027 0.027 0.000 2005-01-31 00:00:00 -0.022 0.007 -0.022 -0.022 0.007 -0.022 0.036 0.063 0.000 2005-02-28 00:00:00 0.021 0.028 -0.002 
0.021 0.028 -0.002 -0.015 0.048 -0.015 2005-03-31 00:00:00 -0.018 0.009 -0.020 -0.018 0.009 -0.020 -0.005 0.043 -0.019 The last 5 rows are:\nDate Portfolio_Monthly_Return Portfolio_Cumulative_Return Portfolio_Drawdown SPY_Monthly_Return SPY_Cumulative_Return SPY_Drawdown TLT_Monthly_Return TLT_Cumulative_Return TLT_Drawdown 2025-06-30 00:00:00 0.027 19.004 -0.072 0.051 6.718 0.000 0.027 0.963 -0.408 2025-07-31 00:00:00 -0.011 18.776 -0.082 0.023 6.896 0.000 -0.011 0.941 -0.415 2025-08-31 00:00:00 0.000 18.778 -0.082 0.021 7.058 0.000 0.000 0.941 -0.415 2025-09-30 00:00:00 0.036 19.489 -0.049 0.036 7.345 0.000 0.036 1.011 -0.394 2025-10-31 00:00:00 0.014 19.772 -0.036 0.024 7.544 0.000 0.014 1.039 -0.385 Next, we\u0026rsquo;ll look at performance for the assets and portfolio.\nPerformance Statistics We can then plot the monthly returns:\nAnd cumulative returns:\nAnd drawdowns:\nFinally, we can run the stats on the hybrid portfolio, SPY, and TLT with the following code:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 port_sum_stats = summary_stats( fund_list=[\u0026#34;Portfolio\u0026#34;, \u0026#34;SPY\u0026#34;, \u0026#34;TLT\u0026#34;], df=portfolio_monthly[[\u0026#34;Portfolio_Monthly_Return\u0026#34;]], period=\u0026#34;Monthly\u0026#34;, use_calendar_days=False, excel_export=False, pickle_export=False, output_confirmation=False, ) spy_sum_stats = summary_stats( fund_list=[\u0026#34;Portfolio\u0026#34;, \u0026#34;SPY\u0026#34;, \u0026#34;TLT\u0026#34;], df=portfolio_monthly[[\u0026#34;SPY_Monthly_Return\u0026#34;]], period=\u0026#34;Monthly\u0026#34;, use_calendar_days=False, excel_export=False, pickle_export=False, output_confirmation=False, ) tlt_sum_stats = summary_stats( fund_list=[\u0026#34;Portfolio\u0026#34;, \u0026#34;SPY\u0026#34;, \u0026#34;TLT\u0026#34;], df=portfolio_monthly[[\u0026#34;TLT_Monthly_Return\u0026#34;]], period=\u0026#34;Monthly\u0026#34;, use_calendar_days=False, excel_export=False, 
pickle_export=False, output_confirmation=False, ) sum_stats = port_sum_stats.combine_first(spy_sum_stats).combine_first(tlt_sum_stats) Which gives us:\nAnnualized Mean Annualized Volatility Annualized Sharpe Ratio CAGR Monthly Max Return Monthly Max Return (Date) Monthly Min Return Monthly Min Return (Date) Max Drawdown Peak Trough Recovery Date Days to Recover MAR Ratio Portfolio_Monthly_Return 0.156 0.140 1.111 0.155 0.143 2008-11-30 00:00:00 -0.107 2009-02-28 00:00:00 -0.239 2021-12-31 00:00:00 2022-09-30 00:00:00 2023-12-31 00:00:00 457.000 0.650 SPY_Monthly_Return 0.114 0.148 0.769 0.108 0.127 2020-04-30 00:00:00 -0.165 2008-10-31 00:00:00 -0.508 2007-10-31 00:00:00 2009-02-28 00:00:00 2012-03-31 00:00:00 1127.000 0.212 TLT_Monthly_Return 0.043 0.137 0.316 0.035 0.143 2008-11-30 00:00:00 -0.131 2009-01-31 00:00:00 -0.476 2020-07-31 00:00:00 2023-10-31 00:00:00 NaT nan 0.072 Based on the above, our hybrid portfolio outperforms both stocks and bonds by a wide margin.\nFuture Investigation Several ideas seem intriguing for future investigation:\nDo investment grade or high yield bonds show different behavior from long-term US Treasury bonds? Does a commodity index (such as the GSCI) exhibit different behavior from gold? How does leverage affect the observed returns for the hybrid portfolio, stocks, and bonds? Do other Fed tightening/loosening cycles exhibit the same behavior for returns? References https://fred.stlouisfed.org/series/FEDFUNDS Code The jupyter notebook with the functions and all other code is available here. The html export of the jupyter notebook is available here. 
The pdf export of the jupyter notebook is available here.\n","date":"2025-11-29T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2025/11/29/asset-class-performance-fed-policy-cycles/asset-class-performance-fed-policy-cycles.png","permalink":"https://www.jaredszajkowski.com/stack/2025/11/29/asset-class-performance-fed-policy-cycles/","title":"Performance Of Various Asset Classes During Fed Policy Cycles"},{"content":"Introduction Similar to the recent post about how I collect and store crypto asset data from Coinbase, the scripts below pull minute, hour, and daily data for equities and ETFs from Polygon.io.\nThe scripts check for an existing data record, and if found then the existing record is updated to include the most recent data. If there is not an existing data record, then the complete historical record from Polygon is pulled and stored.\nPython Functions Here are the functions needed for this project:\npolygon_fetch_full_history: Fetch full historical data for a given product from Polygon API. polygon_pull_data: Read existing data file, download price data from Polygon, and export data. Function Usage Polygon Fetch Full History Here\u0026rsquo;s the docstring with the parameters/variables:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 \u0026#34;\u0026#34;\u0026#34; Fetch full historical data for a given product from Polygon API. Parameters: ----------- client Polygon API client instance. ticker : str Ticker symbol to download. timespan : str Time span for the data (e.g., \u0026#34;minute\u0026#34;, \u0026#34;hour\u0026#34;, \u0026#34;day\u0026#34;, \u0026#34;week\u0026#34;, \u0026#34;month\u0026#34;, \u0026#34;quarter\u0026#34;, \u0026#34;year\u0026#34;). multiplier : int Multiplier for the time span (e.g., 1 for daily data). adjusted : bool If True, return adjusted data; if False, return raw data. full_history_df : pd.DataFrame DataFrame containing the data. 
current_start : datetime Date for which to start pulling data in datetime format. free_tier : bool If True, then pause to avoid API limits. verbose : bool If True, print detailed information about the data being processed. Returns: -------- full_history_df : pd.DataFrame DataFrame containing the data. \u0026#34;\u0026#34;\u0026#34; This script pulls the full history for a specified asset:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 from load_api_keys import load_api_keys from polygon import RESTClient # Load API keys from the environment api_keys = load_api_keys() # Get the environment variable for where data is stored DATA_DIR = config(\u0026#34;DATA_DIR\u0026#34;) # Open client connection client = RESTClient(api_key=api_keys[\u0026#34;POLYGON_KEY\u0026#34;]) # Create an empty DataFrame df = pd.DataFrame({ \u0026#39;Date\u0026#39;: pd.Series(dtype=\u0026#34;datetime64[ns]\u0026#34;), \u0026#39;open\u0026#39;: pd.Series(dtype=\u0026#34;float64\u0026#34;), \u0026#39;high\u0026#39;: pd.Series(dtype=\u0026#34;float64\u0026#34;), \u0026#39;low\u0026#39;: pd.Series(dtype=\u0026#34;float64\u0026#34;), \u0026#39;close\u0026#39;: pd.Series(dtype=\u0026#34;float64\u0026#34;), \u0026#39;volume\u0026#39;: pd.Series(dtype=\u0026#34;float64\u0026#34;), \u0026#39;vwap\u0026#39;: pd.Series(dtype=\u0026#34;float64\u0026#34;), \u0026#39;transactions\u0026#39;: pd.Series(dtype=\u0026#34;int64\u0026#34;), \u0026#39;otc\u0026#39;: pd.Series(dtype=\u0026#34;object\u0026#34;) }) # Example usage - daily df = polygon_fetch_full_history( client=client, ticker=\u0026#34;AMZN\u0026#34;, timespan=\u0026#34;day\u0026#34;, multiplier=1, adjusted=True, full_history_df=df, current_start=datetime(2025, 1, 1), free_tier=True, verbose=True, ) The example above pulls the daily data since 1/1/2025, but it can handle date ranges spanning years because it pulls only a limited number of records at a time, as recommended by Polygon (less than 5,000 
records per API request), and then combines the records in the dataframe before returning the dataframe.\nDate open high low close volume vwap transactions otc 0 2025-01-02 05:00:00 222.03000 225.15000 218.19000 220.22000 33956579.00000 221.27450 449631 1 2025-01-03 05:00:00 222.50500 225.36000 221.62000 224.19000 27515606.00000 223.70500 346976 2 2025-01-06 05:00:00 226.78000 228.83500 224.84000 227.61000 31849831.00000 227.09210 410686 3 2025-01-07 05:00:00 227.90000 228.38100 221.46000 222.11000 28084164.00000 223.40330 379570 4 2025-01-08 05:00:00 223.18500 223.52000 220.20000 222.13000 25033292.00000 222.04140 325539 5 2025-01-10 05:00:00 221.46000 221.71000 216.50000 218.94000 36811525.00000 218.86400 493840 6 2025-01-13 05:00:00 218.06000 219.40000 216.47000 218.46000 27262655.00000 218.14260 373519 7 2025-01-14 05:00:00 220.44000 221.82000 216.20000 217.76000 24711650.00000 218.62450 332022 8 2025-01-15 05:00:00 222.83000 223.57000 220.75000 223.35000 31291257.00000 222.66900 353985 9 2025-01-16 05:00:00 224.42000 224.65000 220.31000 220.66000 24757276.00000 221.89420 313323 10 2025-01-17 05:00:00 225.84000 226.51000 223.08000 225.94000 42370123.00000 225.39270 385914 11 2025-01-21 05:00:00 228.90000 231.78000 226.94000 230.71000 39951456.00000 230.09010 552447 12 2025-01-22 05:00:00 232.02000 235.44000 231.19000 235.01000 41448217.00000 234.09500 512233 13 2025-01-23 05:00:00 234.10000 235.52000 231.51000 235.42000 26404364.00000 234.24350 365153 14 2025-01-24 05:00:00 234.50000 236.40000 232.93000 234.85000 25890738.00000 234.45870 349378 15 2025-01-27 05:00:00 226.21000 235.61000 225.86000 235.42000 49428332.00000 231.81880 661486 16 2025-01-28 05:00:00 234.29000 241.77000 233.98000 238.15000 41587188.00000 238.63650 542344 17 2025-01-29 05:00:00 239.01500 240.39000 236.15000 237.07000 26091716.00000 237.41410 384889 18 2025-01-30 05:00:00 237.14000 237.95000 232.22000 234.64000 32020728.00000 235.02300 428122 19 2025-01-31 05:00:00 236.50000 240.29000 
236.41000 237.68000 36162377.00000 238.20190 435388 20 2025-02-03 05:00:00 234.06000 239.25000 232.90000 237.42000 37285868.00000 236.60880 551855 21 2025-02-04 05:00:00 239.01000 242.52000 238.03000 242.06000 29713812.00000 241.16980 412003 22 2025-02-05 05:00:00 237.02000 238.32000 235.20000 236.17000 38832042.00000 236.57200 552144 23 2025-02-06 05:00:00 238.01000 239.65990 236.01000 238.83000 60897095.00000 235.75270 803119 24 2025-02-07 05:00:00 232.50000 234.81000 228.06000 229.15000 77539276.00000 230.45270 943441 25 2025-02-10 05:00:00 230.54500 233.92000 229.20000 233.14000 35419926.00000 232.41510 473182 26 2025-02-11 05:00:00 231.92000 233.44000 230.13000 232.76000 23713726.00000 232.11010 329473 27 2025-02-12 05:00:00 230.46000 231.18000 228.16000 228.93000 32285249.00000 229.60790 400325 28 2025-02-13 05:00:00 228.85000 230.42000 227.52000 230.37000 31346512.00000 229.33120 408846 29 2025-02-14 05:00:00 229.20000 229.89000 227.23000 228.68000 27031084.00000 228.65430 375369 30 2025-02-18 05:00:00 228.82000 229.30000 223.72000 226.65000 42975133.00000 225.80820 600702 31 2025-02-19 05:00:00 225.52000 226.83000 223.71000 226.63000 28566709.00000 225.37730 391423 32 2025-02-20 05:00:00 224.77500 225.13000 221.81000 222.88000 30001665.00000 222.93510 447333 33 2025-02-21 05:00:00 223.28000 223.31000 214.74000 216.58000 55323850.00000 217.71220 774966 34 2025-02-24 05:00:00 217.45000 217.71500 212.42000 212.71000 42387585.00000 214.05260 586136 35 2025-02-25 05:00:00 211.63000 213.34000 204.16000 212.80000 58957977.00000 209.69830 800750 36 2025-02-26 05:00:00 214.94000 218.16000 213.09000 214.35000 39120603.00000 215.44180 508442 37 2025-02-27 05:00:00 218.35000 219.97000 208.37000 208.74000 40548571.00000 212.37250 558788 38 2025-02-28 05:00:00 208.65000 212.62000 206.99000 212.28000 51771737.00000 210.40300 540930 39 2025-03-03 05:00:00 213.35200 214.01000 202.55000 205.02000 42948447.00000 207.41030 666781 40 2025-03-04 05:00:00 200.11000 206.80000 
197.43200 203.80000 60853084.00000 201.92830 872022 41 2025-03-05 05:00:00 204.80000 209.98000 203.26000 208.36000 38610085.00000 207.05520 514659 42 2025-03-06 05:00:00 204.40000 205.77000 198.30150 200.70000 49863755.00000 201.45630 688561 43 2025-03-07 05:00:00 199.49000 202.26530 192.53000 199.25000 59802821.00000 197.66190 785672 44 2025-03-10 04:00:00 195.60000 196.73000 190.85000 194.54000 62350926.00000 193.58390 902822 45 2025-03-11 04:00:00 193.90000 200.18000 193.40000 196.59000 54002880.00000 196.82150 668549 46 2025-03-12 04:00:00 200.72000 201.52000 195.29000 198.89000 43679284.00000 198.88200 573896 47 2025-03-13 04:00:00 198.16500 198.87990 191.82000 193.89000 41270761.00000 194.33260 588538 48 2025-03-14 04:00:00 197.41000 198.65000 195.32000 197.95000 38096663.00000 197.43590 470644 49 2025-03-17 04:00:00 198.77000 199.00000 194.32470 195.74000 47341752.00000 196.46120 568556 50 2025-03-18 04:00:00 192.52000 194.00000 189.38000 192.82000 40414867.00000 192.08510 554394 51 2025-03-19 04:00:00 193.38000 195.96500 191.96000 195.54000 39442878.00000 194.28850 445698 52 2025-03-20 04:00:00 193.07000 199.32000 192.30000 194.95000 38921113.00000 195.63600 455391 53 2025-03-21 04:00:00 192.90000 196.99000 192.52000 196.21000 60056917.00000 195.30150 394836 54 2025-03-24 04:00:00 200.00000 203.64000 199.95000 203.26000 41625365.00000 202.36990 515880 55 2025-03-25 04:00:00 203.59500 206.21000 203.22000 205.71000 31171161.00000 205.10370 402368 56 2025-03-26 04:00:00 205.83500 206.01000 199.92500 201.13000 32990973.00000 202.21640 450177 57 2025-03-27 04:00:00 200.89000 203.79000 199.28210 201.36000 27317661.00000 201.85530 363321 58 2025-03-28 04:00:00 198.42000 199.26000 191.88100 192.72000 52548226.00000 193.98200 678645 59 2025-03-31 04:00:00 188.19000 191.33000 184.40000 190.26000 63547558.00000 188.28000 769862 60 2025-04-01 04:00:00 187.86000 193.93000 187.20000 192.17000 41267315.00000 191.47040 528168 61 2025-04-02 04:00:00 187.66000 198.34000 
187.66000 196.01000 53679198.00000 194.28700 710850 62 2025-04-03 04:00:00 182.99500 184.13000 176.92000 178.41000 95553617.00000 180.34260 1447903 63 2025-04-04 04:00:00 167.14500 178.14360 166.00000 171.00000 123159359.00000 173.21620 1529094 64 2025-04-07 04:00:00 162.00000 183.40990 161.38000 175.26000 109327115.00000 172.66900 1420152 65 2025-04-08 04:00:00 185.23000 185.90000 168.57000 170.66000 87710360.00000 176.41280 1146023 66 2025-04-09 04:00:00 172.11500 192.65000 169.93000 191.10000 116804328.00000 182.33450 1310706 67 2025-04-10 04:00:00 185.44000 186.86920 175.85180 181.22000 68302045.00000 181.65100 925662 68 2025-04-11 04:00:00 179.93000 185.86000 178.00000 184.87000 50594339.00000 182.64530 638386 69 2025-04-14 04:00:00 186.84000 187.44000 179.23000 182.12000 48002540.00000 182.95010 656762 70 2025-04-15 04:00:00 181.41000 182.35000 177.93310 179.59000 43641952.00000 180.07010 564759 71 2025-04-16 04:00:00 176.29000 179.10460 171.41000 174.33000 51875316.00000 174.85240 685436 72 2025-04-17 04:00:00 176.00000 176.21000 172.00000 172.61000 44726453.00000 173.58080 527949 73 2025-04-21 04:00:00 169.60000 169.60000 165.28500 167.32000 48126111.00000 166.92550 726774 74 2025-04-22 04:00:00 169.84500 176.78000 169.35000 173.18000 56607202.00000 173.24700 617509 75 2025-04-23 04:00:00 183.45000 187.38000 180.19000 180.60000 63470149.00000 183.12080 732657 76 2025-04-24 04:00:00 180.91500 186.74000 180.18000 186.54000 43763196.00000 184.88010 516092 77 2025-04-25 04:00:00 187.62000 189.94000 185.49000 188.99000 36414330.00000 187.99630 489570 78 2025-04-28 04:00:00 190.10500 190.22000 184.88500 187.70000 33224732.00000 187.47010 452721 79 2025-04-29 04:00:00 183.99000 188.01580 183.68000 187.39000 41667255.00000 186.29110 507603 80 2025-04-30 04:00:00 182.17000 185.05000 178.85000 184.42000 55176543.00000 182.76580 688982 81 2025-05-01 04:00:00 190.63000 191.80711 187.50000 190.20000 74265963.00000 188.97670 890648 82 2025-05-02 04:00:00 191.43500 
192.88000 186.40000 189.98000 77903487.00000 189.93000 920469 83 2025-05-05 04:00:00 186.51000 188.18000 185.53000 186.35000 35217469.00000 186.93730 458698 84 2025-05-06 04:00:00 184.57000 187.93000 183.85000 185.01000 29314055.00000 185.66190 371859 85 2025-05-07 04:00:00 185.56000 190.99000 185.01000 188.71000 44002926.00000 188.49640 499932 86 2025-05-08 04:00:00 191.43000 194.33000 188.82000 192.08000 41043620.00000 192.12660 513351 87 2025-05-09 04:00:00 193.37500 194.69000 191.16000 193.06000 29663143.00000 192.75640 362745 88 2025-05-12 04:00:00 210.71000 211.66000 205.75000 208.64000 75205042.00000 208.15580 920091 89 2025-05-13 04:00:00 211.08000 214.84000 210.10000 211.37000 56193682.00000 212.67920 743293 90 2025-05-14 04:00:00 211.45000 211.93000 208.85000 210.25000 38492128.00000 210.46140 519690 91 2025-05-15 04:00:00 206.45000 206.88000 202.67300 205.17000 64347317.00000 204.78920 821281 92 2025-05-16 04:00:00 206.85000 206.85000 204.37400 205.59000 43318478.00000 205.33360 490987 93 2025-05-19 04:00:00 201.64500 206.62000 201.26000 206.16000 34314810.00000 205.36920 425652 94 2025-05-20 04:00:00 204.62800 205.58990 202.65000 204.07000 29470373.00000 204.01080 417640 95 2025-05-21 04:00:00 201.61000 203.45500 200.06000 201.12000 42460924.00000 201.52350 569211 96 2025-05-22 04:00:00 201.38000 205.76000 200.16000 203.10000 38938882.00000 203.32730 514617 97 2025-05-23 04:00:00 198.90000 202.37000 197.85000 200.99000 33393545.00000 200.78910 483196 98 2025-05-27 04:00:00 203.08500 206.69000 202.19000 206.02000 34892044.00000 205.21560 505171 99 2025-05-28 04:00:00 205.91500 207.66000 204.41000 204.72000 28549753.00000 205.69780 407710 100 2025-05-29 04:00:00 208.02500 208.81000 204.23000 205.70000 34700005.00000 206.29430 496232 101 2025-05-30 04:00:00 204.84000 205.99000 201.69500 205.01000 51679406.00000 204.53180 493714 102 2025-06-02 04:00:00 204.98000 207.00000 202.68000 206.65000 29113319.00000 205.59400 438892 103 2025-06-03 04:00:00 207.10500 
208.94690 205.03000 205.71000 33139121.00000 206.63430 436643 104 2025-06-04 04:00:00 206.55000 208.18000 205.18000 207.23000 29915592.00000 206.89200 406881 105 2025-06-05 04:00:00 209.55000 212.81000 207.56000 207.91000 51979243.00000 209.75510 668806 106 2025-06-06 04:00:00 212.40000 213.86990 210.50000 213.57000 39832500.00000 212.56720 512176 107 2025-06-09 04:00:00 214.75000 217.85000 212.88000 216.98000 38102502.00000 216.02540 560148 108 2025-06-10 04:00:00 216.78000 217.69000 214.15000 217.61000 31303317.00000 216.55610 429031 109 2025-06-11 04:00:00 217.41000 218.40000 212.89000 213.20000 39325981.00000 214.88400 503640 110 2025-06-12 04:00:00 211.78000 213.58000 211.33000 213.24000 27639991.00000 212.86550 364940 111 2025-06-13 04:00:00 209.96000 214.05000 209.62000 212.10000 29337763.00000 211.91290 443361 112 2025-06-16 04:00:00 212.31000 217.06000 211.60000 216.10000 33284158.00000 215.16220 457829 113 2025-06-17 04:00:00 215.19500 217.41000 214.56000 214.82000 32086262.00000 215.73320 431316 114 2025-06-18 04:00:00 215.09000 217.96000 212.34000 212.52000 44360509.00000 214.61350 475101 115 2025-06-20 04:00:00 214.68000 214.89000 208.27090 209.69000 75350733.00000 210.62300 591675 116 2025-06-23 04:00:00 209.79000 210.39000 207.31010 208.47000 37311725.00000 208.91300 513937 117 2025-06-24 04:00:00 212.13500 214.34000 211.04500 212.77000 38378757.00000 213.11290 459901 118 2025-06-25 04:00:00 214.61500 216.03000 211.11000 211.99000 31755698.00000 212.80660 417565 119 2025-06-26 04:00:00 213.12000 218.03500 212.01000 217.12000 50480814.00000 216.09440 572822 120 2025-06-27 04:00:00 219.92000 223.30000 216.74000 223.30000 119217138.00000 221.72670 750603 121 2025-06-30 04:00:00 223.52000 223.82000 219.12000 219.39000 58887780.00000 220.63160 673189 122 2025-07-01 04:00:00 219.50000 221.87500 217.93000 220.46000 39256830.00000 220.15080 544150 123 2025-07-02 04:00:00 219.73000 221.60000 219.06000 219.92000 30894178.00000 220.21030 429633 124 2025-07-03 
04:00:00 221.82000 224.01000 221.36000 223.41000 29632353.00000 222.88670 364422 125 2025-07-07 04:00:00 223.00000 224.29000 222.37000 223.47000 36604139.00000 223.41210 513469 126 2025-07-08 04:00:00 223.91500 224.00000 218.43000 219.36000 45691987.00000 220.48360 615447 127 2025-07-09 04:00:00 221.07000 224.29000 220.47000 222.54000 38155121.00000 222.45550 493756 128 2025-07-10 04:00:00 221.55000 222.79000 219.70000 222.26000 30370591.00000 221.70550 451223 129 2025-07-11 04:00:00 223.58000 226.67990 222.37000 225.02000 50518307.00000 224.89080 661385 130 2025-07-14 04:00:00 225.07000 226.66000 224.24000 225.69000 35702597.00000 225.61730 460428 131 2025-07-15 04:00:00 226.20000 227.27000 225.45500 226.35000 34907294.00000 226.49850 507705 132 2025-07-16 04:00:00 225.87500 226.10000 222.18000 223.19000 39535926.00000 223.78720 556155 133 2025-07-17 04:00:00 223.32000 224.50000 222.51000 223.88000 31855831.00000 223.69480 445580 134 2025-07-18 04:00:00 225.14000 226.40000 222.98000 226.13000 37833807.00000 225.29060 454003 135 2025-07-21 04:00:00 225.83500 229.69000 225.65000 229.30000 40297556.00000 228.32920 530328 136 2025-07-22 04:00:00 229.68000 230.00000 226.35000 227.47000 37483702.00000 227.86620 475209 137 2025-07-23 04:00:00 228.47000 228.79000 227.09000 228.29000 28294852.00000 228.17650 344504 138 2025-07-24 04:00:00 229.17000 236.00000 228.64000 232.23000 42902266.00000 231.80220 526287 139 2025-07-25 04:00:00 232.22000 232.48000 231.18000 231.44000 28712095.00000 231.78310 365765 140 2025-07-28 04:00:00 233.35000 234.29000 232.25000 232.79000 26300138.00000 233.09970 394809 141 2025-07-29 04:00:00 234.15000 234.72000 230.31000 231.01000 33716220.00000 231.48300 446598 142 2025-07-30 04:00:00 231.64000 231.80000 229.29000 230.19000 32993273.00000 230.92400 445462 143 2025-07-31 04:00:00 235.77000 236.53000 231.40000 234.11000 104357263.00000 232.41060 1254660 144 2025-08-01 04:00:00 217.21000 220.43990 212.80000 214.75000 122258801.00000 216.29730 
(intermediate rows omitted for brevity)

The last 5 rows are:

215 2025-11-11 05:00:00 248.41000 249.74990 247.23000 249.10000 23563960.00000 248.64500 378251
216 2025-11-12 05:00:00 250.23500 250.37000 243.75000 244.20000 31190063.00000 245.57220 489440
217 2025-11-13 05:00:00 243.05000 243.75000 236.50000 237.58000 41401638.00000 239.23370 614028
218 2025-11-14 05:00:00 235.06000 238.73000 232.89000 234.69000 38956619.00000 235.71950 630487
219 2025-11-17 05:00:00 233.25000 234.60000 229.19000 232.87000 59918908.00000 231.89100 893852

Polygon Pull Data

This script uses the above function to perform the following:

- Attempt to read an existing pickle data file
- If a data file exists, then pull updated data
- Otherwise, pull all historical data available for that asset for the past 2 years (using the free tier from Polygon)
- Store pickle and/or Excel files of the data in the specified directories

Here's the docstring with the parameters/variables:

"""
Read existing data file, download price data from Polygon, and export data.

Parameters:
-----------
base_directory : any
    Root path to store downloaded data.
ticker : str
    Ticker symbol to download.
source : str
    Name of the data source (e.g., 'Polygon').
asset_class : str
    Asset class name (e.g., 'Equities').
start_date : datetime
    Start date for the data in datetime format.
timespan : str
    Time span for the data (e.g., "minute", "hour", "day", "week", "month", "quarter", "year").
multiplier : int
    Multiplier for the time span (e.g., 1 for daily data).
adjusted : bool
    If True, return adjusted data; if False, return raw data.
force_existing_check : bool
    If True, force a complete check of the existing data file to verify that there are not any gaps in the data.
free_tier : bool
    If True, then pause to avoid API limits.
verbose : bool
    If True, print detailed information about the data being processed.
excel_export : bool
    If True, export data to Excel format.
pickle_export : bool
    If True, export data to Pickle format.
output_confirmation : bool
    If True, print confirmation message.
Returns:
--------
None
"""

Through the base_directory, source, and asset_class variables, the script knows where in the local filesystem to look for an existing pickle file and where to store the resulting updated pickle and/or Excel files:

current_year = datetime.now().year
current_month = datetime.now().month
current_day = datetime.now().day

# Example usage - daily
df = polygon_pull_data(
    base_directory=DATA_DIR,
    ticker="AMZN",
    source="Polygon",
    asset_class="Equities",
    start_date=datetime(current_year - 2, current_month, current_day),
    timespan="day",
    multiplier=1,
    adjusted=True,
    force_existing_check=True,
    free_tier=True,
    verbose=True,
    excel_export=True,
    pickle_export=True,
    output_confirmation=True,
)

Here's the output from above:

Date open high low close volume vwap transactions otc
0 2023-07-28 04:00:00 129.69000 133.01000 129.33000 132.21000 46269781.00000 131.88370 413438
1 2023-07-31 04:00:00 133.20000 133.87000 132.38000 133.68000 41901516.00000 133.34100 406644
2 2023-08-01 04:00:00 133.55000 133.69000 131.61990 131.69000 42250989.00000 132.24700 385743
3 2023-08-02 04:00:00 130.15400 130.23000 126.82000 128.21000 50988614.00000 128.39730 532942
4 2023-08-03 04:00:00 127.48000 129.84000 126.41000 128.91000 90855736.00000 131.49410 746639
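The read-or-update decision in the step list above can be sketched as a small pure helper. Note that `resolve_pull_start` is a hypothetical name for illustration, not part of the actual `polygon_pull_data` implementation, and it assumes the stored DataFrame has a `Date` column like the output shown below:

```python
from datetime import datetime, timedelta
from typing import Optional

import pandas as pd


def resolve_pull_start(existing: Optional[pd.DataFrame],
                       default_lookback_days: int = 730) -> datetime:
    """Pick the start date for the next Polygon pull.

    If data already exists on disk, resume from the day after the last
    stored bar; otherwise fall back to roughly 2 years of history
    (the window available on Polygon's free tier).
    """
    if existing is not None and not existing.empty:
        # Resume from the day after the most recent bar we already have
        last_bar = pd.to_datetime(existing["Date"]).max()
        return (last_bar + timedelta(days=1)).to_pydatetime()
    # No existing file: pull the full free-tier history
    return datetime.now() - timedelta(days=default_lookback_days)
```

With an existing file, only the missing tail of the history is requested, which keeps the number of API calls small — useful when `free_tier=True` forces pauses between requests.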
(remaining daily rows omitted for brevity)
210.05000 52878383.00000 210.12820 610078 324 2024-11-08 05:00:00 209.72000 209.96330 207.44000 208.18000 36075846.00000 208.65000 443816 325 2024-11-11 05:00:00 208.50000 209.65000 205.59000 206.84000 35456012.00000 206.88870 424873 326 2024-11-12 05:00:00 208.37000 209.54000 206.01000 208.91000 38942918.00000 208.09130 476724 327 2024-11-13 05:00:00 209.40000 215.09000 209.14000 214.10000 46212943.00000 212.85230 529093 328 2024-11-14 05:00:00 214.16000 215.90000 210.88000 211.48000 42620309.00000 212.34670 463284 329 2024-11-15 05:00:00 206.76000 207.34000 199.61000 202.61000 86591144.00000 203.31350 968372 330 2024-11-18 05:00:00 204.15000 204.67000 200.95000 201.70000 36512465.00000 202.45540 425696 331 2024-11-19 05:00:00 199.33000 205.30000 198.78000 204.61000 31197867.00000 203.30520 383403 332 2024-11-20 05:00:00 202.98000 203.13000 199.45000 202.88000 32768989.00000 201.71210 406917 333 2024-11-21 05:00:00 203.49000 203.49000 195.75000 198.38000 58800042.00000 198.69470 736592 334 2024-11-22 05:00:00 198.25000 199.25990 196.75000 197.12000 31530844.00000 197.58660 402719 335 2024-11-25 05:00:00 199.28000 201.94990 199.00000 201.45000 40685672.00000 200.84360 427416 336 2024-11-26 05:00:00 201.90000 208.00000 201.79000 207.86000 41673737.00000 206.27400 496396 337 2024-11-27 05:00:00 206.98000 207.64000 205.05000 205.74000 28061638.00000 206.09780 350396 338 2024-11-29 05:00:00 205.83000 208.20000 204.59000 207.89000 24892447.00000 206.77660 263735 339 2024-12-02 05:00:00 209.96000 212.99000 209.51010 210.71000 39523185.00000 210.96540 461819 340 2024-12-03 05:00:00 210.31000 214.02000 209.65000 213.44000 32214828.00000 212.63940 379348 341 2024-12-04 05:00:00 215.96000 220.00000 215.75000 218.16000 48745716.00000 218.16240 553276 342 2024-12-05 05:00:00 218.03000 222.15000 217.30000 220.55000 41140220.00000 220.17580 463440 343 2024-12-06 05:00:00 220.75000 227.15000 220.60000 227.03000 44178069.00000 225.05480 524366 344 2024-12-09 05:00:00 227.21000 
230.08000 225.67000 226.09000 46819363.00000 227.70600 572754 345 2024-12-10 05:00:00 226.09000 229.06000 224.20020 225.04000 31199864.00000 226.26390 417668 346 2024-12-11 05:00:00 226.41000 231.20000 226.26000 230.26000 35385785.00000 229.90090 427394 347 2024-12-12 05:00:00 229.83000 231.09000 227.63000 228.97000 28204084.00000 229.33950 342638 348 2024-12-13 05:00:00 228.40000 230.20000 225.86080 227.46000 28768080.00000 227.85530 362555 349 2024-12-16 05:00:00 230.23000 233.00000 228.01000 232.93000 37552096.00000 231.54840 430659 350 2024-12-17 05:00:00 232.39000 232.73000 227.85000 231.15000 35948131.00000 230.78970 433266 351 2024-12-18 05:00:00 230.77000 231.39990 220.11000 220.52000 43281443.00000 225.24410 529103 352 2024-12-19 05:00:00 224.91000 226.09000 222.92000 223.29000 39918739.00000 224.02620 466361 353 2024-12-20 05:00:00 219.84000 226.21000 218.73000 224.92000 88279184.00000 223.94970 456278 354 2024-12-23 05:00:00 225.01000 226.88000 223.90000 225.06000 28070007.00000 225.27080 321899 355 2024-12-24 05:00:00 226.94000 229.14000 226.13000 229.05000 15007497.00000 228.36190 203039 356 2024-12-26 05:00:00 228.50000 228.50000 226.67060 227.05000 16174500.00000 227.61460 262085 357 2024-12-27 05:00:00 225.60000 226.03000 220.90000 223.75000 27367147.00000 223.01460 381077 358 2024-12-30 05:00:00 220.06000 222.99720 218.43000 221.30000 28321240.00000 220.88500 371497 359 2024-12-31 05:00:00 222.96500 223.22990 218.94000 219.39000 24819655.00000 220.40050 308613 360 2025-01-02 05:00:00 222.03000 225.15000 218.19000 220.22000 33956579.00000 221.27450 449631 361 2025-01-03 05:00:00 222.50500 225.36000 221.62000 224.19000 27515606.00000 223.70500 346976 362 2025-01-06 05:00:00 226.78000 228.83500 224.84000 227.61000 31849831.00000 227.09210 410686 363 2025-01-07 05:00:00 227.90000 228.38100 221.46000 222.11000 28084164.00000 223.40330 379570 364 2025-01-08 05:00:00 223.18500 223.52000 220.20000 222.13000 25033292.00000 222.04140 325539 365 2025-01-10 
05:00:00 221.46000 221.71000 216.50000 218.94000 36811525.00000 218.86400 493840 366 2025-01-13 05:00:00 218.06000 219.40000 216.47000 218.46000 27262655.00000 218.14260 373519 367 2025-01-14 05:00:00 220.44000 221.82000 216.20000 217.76000 24711650.00000 218.62450 332022 368 2025-01-15 05:00:00 222.83000 223.57000 220.75000 223.35000 31291257.00000 222.66900 353985 369 2025-01-16 05:00:00 224.42000 224.65000 220.31000 220.66000 24757276.00000 221.89420 313323 370 2025-01-17 05:00:00 225.84000 226.51000 223.08000 225.94000 42370123.00000 225.39270 385914 371 2025-01-21 05:00:00 228.90000 231.78000 226.94000 230.71000 39951456.00000 230.09010 552447 372 2025-01-22 05:00:00 232.02000 235.44000 231.19000 235.01000 41448217.00000 234.09500 512233 373 2025-01-23 05:00:00 234.10000 235.52000 231.51000 235.42000 26404364.00000 234.24350 365153 374 2025-01-24 05:00:00 234.50000 236.40000 232.93000 234.85000 25890738.00000 234.45870 349378 375 2025-01-27 05:00:00 226.21000 235.61000 225.86000 235.42000 49428332.00000 231.81880 661486 376 2025-01-28 05:00:00 234.29000 241.77000 233.98000 238.15000 41587188.00000 238.63650 542344 377 2025-01-29 05:00:00 239.01500 240.39000 236.15000 237.07000 26091716.00000 237.41410 384889 378 2025-01-30 05:00:00 237.14000 237.95000 232.22000 234.64000 32020728.00000 235.02300 428122 379 2025-01-31 05:00:00 236.50000 240.29000 236.41000 237.68000 36162377.00000 238.20190 435388 380 2025-02-03 05:00:00 234.06000 239.25000 232.90000 237.42000 37285868.00000 236.60880 551855 381 2025-02-04 05:00:00 239.01000 242.52000 238.03000 242.06000 29713812.00000 241.16980 412003 382 2025-02-05 05:00:00 237.02000 238.32000 235.20000 236.17000 38832042.00000 236.57200 552144 383 2025-02-06 05:00:00 238.01000 239.65990 236.01000 238.83000 60897095.00000 235.75270 803119 384 2025-02-07 05:00:00 232.50000 234.81000 228.06000 229.15000 77539276.00000 230.45270 943441 385 2025-02-10 05:00:00 230.54500 233.92000 229.20000 233.14000 35419926.00000 232.41510 
473182 386 2025-02-11 05:00:00 231.92000 233.44000 230.13000 232.76000 23713726.00000 232.11010 329473 387 2025-02-12 05:00:00 230.46000 231.18000 228.16000 228.93000 32285249.00000 229.60790 400325 388 2025-02-13 05:00:00 228.85000 230.42000 227.52000 230.37000 31346512.00000 229.33120 408846 389 2025-02-14 05:00:00 229.20000 229.89000 227.23000 228.68000 27031084.00000 228.65430 375369 390 2025-02-18 05:00:00 228.82000 229.30000 223.72000 226.65000 42975133.00000 225.80820 600702 391 2025-02-19 05:00:00 225.52000 226.83000 223.71000 226.63000 28566709.00000 225.37730 391423 392 2025-02-20 05:00:00 224.77500 225.13000 221.81000 222.88000 30001665.00000 222.93510 447333 393 2025-02-21 05:00:00 223.28000 223.31000 214.74000 216.58000 55323850.00000 217.71220 774966 394 2025-02-24 05:00:00 217.45000 217.71500 212.42000 212.71000 42387585.00000 214.05260 586136 395 2025-02-25 05:00:00 211.63000 213.34000 204.16000 212.80000 58957977.00000 209.69830 800750 396 2025-02-26 05:00:00 214.94000 218.16000 213.09000 214.35000 39120603.00000 215.44180 508442 397 2025-02-27 05:00:00 218.35000 219.97000 208.37000 208.74000 40548571.00000 212.37250 558788 398 2025-02-28 05:00:00 208.65000 212.62000 206.99000 212.28000 51771737.00000 210.40300 540930 399 2025-03-03 05:00:00 213.35200 214.01000 202.55000 205.02000 42948447.00000 207.41030 666781 400 2025-03-04 05:00:00 200.11000 206.80000 197.43200 203.80000 60853084.00000 201.92830 872022 401 2025-03-05 05:00:00 204.80000 209.98000 203.26000 208.36000 38610085.00000 207.05520 514659 402 2025-03-06 05:00:00 204.40000 205.77000 198.30150 200.70000 49863755.00000 201.45630 688561 403 2025-03-07 05:00:00 199.49000 202.26530 192.53000 199.25000 59802821.00000 197.66190 785672 404 2025-03-10 04:00:00 195.60000 196.73000 190.85000 194.54000 62350926.00000 193.58390 902822 405 2025-03-11 04:00:00 193.90000 200.18000 193.40000 196.59000 54002880.00000 196.82150 668549 406 2025-03-12 04:00:00 200.72000 201.52000 195.29000 198.89000 
43679284.00000 198.88200 573896 407 2025-03-13 04:00:00 198.16500 198.87990 191.82000 193.89000 41270761.00000 194.33260 588538 408 2025-03-14 04:00:00 197.41000 198.65000 195.32000 197.95000 38096663.00000 197.43590 470644 409 2025-03-17 04:00:00 198.77000 199.00000 194.32470 195.74000 47341752.00000 196.46120 568556 410 2025-03-18 04:00:00 192.52000 194.00000 189.38000 192.82000 40414867.00000 192.08510 554394 411 2025-03-19 04:00:00 193.38000 195.96500 191.96000 195.54000 39442878.00000 194.28850 445698 412 2025-03-20 04:00:00 193.07000 199.32000 192.30000 194.95000 38921113.00000 195.63600 455391 413 2025-03-21 04:00:00 192.90000 196.99000 192.52000 196.21000 60056917.00000 195.30150 394836 414 2025-03-24 04:00:00 200.00000 203.64000 199.95000 203.26000 41625365.00000 202.36990 515880 415 2025-03-25 04:00:00 203.59500 206.21000 203.22000 205.71000 31171161.00000 205.10370 402368 416 2025-03-26 04:00:00 205.83500 206.01000 199.92500 201.13000 32990973.00000 202.21640 450177 417 2025-03-27 04:00:00 200.89000 203.79000 199.28210 201.36000 27317661.00000 201.85530 363321 418 2025-03-28 04:00:00 198.42000 199.26000 191.88100 192.72000 52548226.00000 193.98200 678645 419 2025-03-31 04:00:00 188.19000 191.33000 184.40000 190.26000 63547558.00000 188.28000 769862 420 2025-04-01 04:00:00 187.86000 193.93000 187.20000 192.17000 41267315.00000 191.47040 528168 421 2025-04-02 04:00:00 187.66000 198.34000 187.66000 196.01000 53679198.00000 194.28700 710850 422 2025-04-03 04:00:00 182.99500 184.13000 176.92000 178.41000 95553617.00000 180.34260 1447903 423 2025-04-04 04:00:00 167.14500 178.14360 166.00000 171.00000 123159359.00000 173.21620 1529094 424 2025-04-07 04:00:00 162.00000 183.40990 161.38000 175.26000 109327115.00000 172.66900 1420152 425 2025-04-08 04:00:00 185.23000 185.90000 168.57000 170.66000 87710360.00000 176.41280 1146023 426 2025-04-09 04:00:00 172.11500 192.65000 169.93000 191.10000 116804328.00000 182.33450 1310706 427 2025-04-10 04:00:00 185.44000 
186.86920 175.85180 181.22000 68302045.00000 181.65100 925662 428 2025-04-11 04:00:00 179.93000 185.86000 178.00000 184.87000 50594339.00000 182.64530 638386 429 2025-04-14 04:00:00 186.84000 187.44000 179.23000 182.12000 48002540.00000 182.95010 656762 430 2025-04-15 04:00:00 181.41000 182.35000 177.93310 179.59000 43641952.00000 180.07010 564759 431 2025-04-16 04:00:00 176.29000 179.10460 171.41000 174.33000 51875316.00000 174.85240 685436 432 2025-04-17 04:00:00 176.00000 176.21000 172.00000 172.61000 44726453.00000 173.58080 527949 433 2025-04-21 04:00:00 169.60000 169.60000 165.28500 167.32000 48126111.00000 166.92550 726774 434 2025-04-22 04:00:00 169.84500 176.78000 169.35000 173.18000 56607202.00000 173.24700 617509 435 2025-04-23 04:00:00 183.45000 187.38000 180.19000 180.60000 63470149.00000 183.12080 732657 436 2025-04-24 04:00:00 180.91500 186.74000 180.18000 186.54000 43763196.00000 184.88010 516092 437 2025-04-25 04:00:00 187.62000 189.94000 185.49000 188.99000 36414330.00000 187.99630 489570 438 2025-04-28 04:00:00 190.10500 190.22000 184.88500 187.70000 33224732.00000 187.47010 452721 439 2025-04-29 04:00:00 183.99000 188.01580 183.68000 187.39000 41667255.00000 186.29110 507603 440 2025-04-30 04:00:00 182.17000 185.05000 178.85000 184.42000 55176543.00000 182.76580 688982 441 2025-05-01 04:00:00 190.63000 191.80711 187.50000 190.20000 74265963.00000 188.97670 890648 442 2025-05-02 04:00:00 191.43500 192.88000 186.40000 189.98000 77903487.00000 189.93000 920469 443 2025-05-05 04:00:00 186.51000 188.18000 185.53000 186.35000 35217469.00000 186.93730 458698 444 2025-05-06 04:00:00 184.57000 187.93000 183.85000 185.01000 29314055.00000 185.66190 371859 445 2025-05-07 04:00:00 185.56000 190.99000 185.01000 188.71000 44002926.00000 188.49640 499932 446 2025-05-08 04:00:00 191.43000 194.33000 188.82000 192.08000 41043620.00000 192.12660 513351 447 2025-05-09 04:00:00 193.37500 194.69000 191.16000 193.06000 29663143.00000 192.75640 362745 448 2025-05-12 
04:00:00 210.71000 211.66000 205.75000 208.64000 75205042.00000 208.15580 920091 449 2025-05-13 04:00:00 211.08000 214.84000 210.10000 211.37000 56193682.00000 212.67920 743293 450 2025-05-14 04:00:00 211.45000 211.93000 208.85000 210.25000 38492128.00000 210.46140 519690 451 2025-05-15 04:00:00 206.45000 206.88000 202.67300 205.17000 64347317.00000 204.78920 821281 452 2025-05-16 04:00:00 206.85000 206.85000 204.37400 205.59000 43318478.00000 205.33360 490987 453 2025-05-19 04:00:00 201.64500 206.62000 201.26000 206.16000 34314810.00000 205.36920 425652 454 2025-05-20 04:00:00 204.62800 205.58990 202.65000 204.07000 29470373.00000 204.01080 417640 455 2025-05-21 04:00:00 201.61000 203.45500 200.06000 201.12000 42460924.00000 201.52350 569211 456 2025-05-22 04:00:00 201.38000 205.76000 200.16000 203.10000 38938882.00000 203.32730 514617 457 2025-05-23 04:00:00 198.90000 202.37000 197.85000 200.99000 33393545.00000 200.78910 483196 458 2025-05-27 04:00:00 203.08500 206.69000 202.19000 206.02000 34892044.00000 205.21560 505171 459 2025-05-28 04:00:00 205.91500 207.66000 204.41000 204.72000 28549753.00000 205.69780 407710 460 2025-05-29 04:00:00 208.02500 208.81000 204.23000 205.70000 34700005.00000 206.29430 496232 461 2025-05-30 04:00:00 204.84000 205.99000 201.69500 205.01000 51679406.00000 204.53180 493714 462 2025-06-02 04:00:00 204.98000 207.00000 202.68000 206.65000 29113319.00000 205.59400 438892 463 2025-06-03 04:00:00 207.10500 208.94690 205.03000 205.71000 33139121.00000 206.63430 436643 464 2025-06-04 04:00:00 206.55000 208.18000 205.18000 207.23000 29915592.00000 206.89200 406881 465 2025-06-05 04:00:00 209.55000 212.81000 207.56000 207.91000 51979243.00000 209.75510 668806 466 2025-06-06 04:00:00 212.40000 213.86990 210.50000 213.57000 39832500.00000 212.56720 512176 467 2025-06-09 04:00:00 214.75000 217.85000 212.88000 216.98000 38102502.00000 216.02540 560148 468 2025-06-10 04:00:00 216.78000 217.69000 214.15000 217.61000 31303317.00000 216.55610 
429031 469 2025-06-11 04:00:00 217.41000 218.40000 212.89000 213.20000 39325981.00000 214.88400 503640 470 2025-06-12 04:00:00 211.78000 213.58000 211.33000 213.24000 27639991.00000 212.86550 364940 471 2025-06-13 04:00:00 209.96000 214.05000 209.62000 212.10000 29337763.00000 211.91290 443361 472 2025-06-16 04:00:00 212.31000 217.06000 211.60000 216.10000 33284158.00000 215.16220 457829 473 2025-06-17 04:00:00 215.19500 217.41000 214.56000 214.82000 32086262.00000 215.73320 431316 474 2025-06-18 04:00:00 215.09000 217.96000 212.34000 212.52000 44360509.00000 214.61350 475101 475 2025-06-20 04:00:00 214.68000 214.89000 208.27090 209.69000 75350733.00000 210.62300 591675 476 2025-06-23 04:00:00 209.79000 210.39000 207.31010 208.47000 37311725.00000 208.91300 513937 477 2025-06-24 04:00:00 212.13500 214.34000 211.04500 212.77000 38378757.00000 213.11290 459901 478 2025-06-25 04:00:00 214.61500 216.03000 211.11000 211.99000 31755698.00000 212.80660 417565 479 2025-06-26 04:00:00 213.12000 218.03500 212.01000 217.12000 50480814.00000 216.09440 572822 480 2025-06-27 04:00:00 219.92000 223.30000 216.74000 223.30000 119217138.00000 221.72670 750603 481 2025-06-30 04:00:00 223.52000 223.82000 219.12000 219.39000 58887780.00000 220.63160 673189 482 2025-07-01 04:00:00 219.50000 221.87500 217.93000 220.46000 39256830.00000 220.15080 544150 483 2025-07-02 04:00:00 219.73000 221.60000 219.06000 219.92000 30894178.00000 220.21030 429633 484 2025-07-03 04:00:00 221.82000 224.01000 221.36000 223.41000 29632353.00000 222.88670 364422 485 2025-07-07 04:00:00 223.00000 224.29000 222.37000 223.47000 36604139.00000 223.41210 513469 486 2025-07-08 04:00:00 223.91500 224.00000 218.43000 219.36000 45691987.00000 220.48360 615447 487 2025-07-09 04:00:00 221.07000 224.29000 220.47000 222.54000 38155121.00000 222.45550 493756 488 2025-07-10 04:00:00 221.55000 222.79000 219.70000 222.26000 30370591.00000 221.70550 451223 489 2025-07-11 04:00:00 223.58000 226.67990 222.37000 225.02000 
50518307.00000 224.89080 661385 490 2025-07-14 04:00:00 225.07000 226.66000 224.24000 225.69000 35702597.00000 225.61730 460428 491 2025-07-15 04:00:00 226.20000 227.27000 225.45500 226.35000 34907294.00000 226.49850 507705 492 2025-07-16 04:00:00 225.87500 226.10000 222.18000 223.19000 39535926.00000 223.78720 556155 493 2025-07-17 04:00:00 223.32000 224.50000 222.51000 223.88000 31855831.00000 223.69480 445580 494 2025-07-18 04:00:00 225.14000 226.40000 222.98000 226.13000 37833807.00000 225.29060 454003 495 2025-07-21 04:00:00 225.83500 229.69000 225.65000 229.30000 40297556.00000 228.32920 530328 496 2025-07-22 04:00:00 229.68000 230.00000 226.35000 227.47000 37483702.00000 227.86620 475209 497 2025-07-23 04:00:00 228.47000 228.79000 227.09000 228.29000 28294852.00000 228.17650 344504 498 2025-07-24 04:00:00 229.17000 236.00000 228.64000 232.23000 42902266.00000 231.80220 526287 499 2025-07-25 04:00:00 232.22000 232.48000 231.18000 231.44000 28712095.00000 231.78310 365765 500 2025-07-28 04:00:00 233.35000 234.29000 232.25000 232.79000 26300138.00000 233.09970 394809 501 2025-07-29 04:00:00 234.15000 234.72000 230.31000 231.01000 33716220.00000 231.48300 446598 502 2025-07-30 04:00:00 231.64000 231.80000 229.29000 230.19000 32993273.00000 230.92400 445462 503 2025-07-31 04:00:00 235.77000 236.53000 231.40000 234.11000 104357263.00000 232.41060 1254660 504 2025-08-01 04:00:00 217.21000 220.43990 212.80000 214.75000 122258801.00000 216.29730 1742475 505 2025-08-04 04:00:00 217.40000 217.44000 211.42000 211.65000 77890146.00000 213.13120 1046525 506 2025-08-05 04:00:00 213.05000 216.30000 212.87000 213.75000 51505121.00000 214.51420 639055 507 2025-08-06 04:00:00 214.69500 222.65000 213.74090 222.31000 54823045.00000 219.42990 654274 508 2025-08-07 04:00:00 221.00000 226.22000 220.82000 223.13000 40603513.00000 223.13570 553279 509 2025-08-08 04:00:00 223.14000 223.80000 221.88360 222.69000 32970477.00000 222.66980 397504 510 2025-08-11 04:00:00 221.78000 
223.05000 220.40000 221.30000 31646222.00000 221.38650 441975 511 2025-08-12 04:00:00 222.23000 223.50000 219.05000 221.47000 37254707.00000 221.41240 472607 512 2025-08-13 04:00:00 222.00000 224.91850 222.00000 224.56000 36508335.00000 223.98460 488194 513 2025-08-14 04:00:00 227.40000 233.11000 227.02000 230.98000 61545824.00000 230.50200 738785 514 2025-08-15 04:00:00 232.58000 234.08000 229.80700 231.03000 39649244.00000 231.38570 495351 515 2025-08-18 04:00:00 230.22500 231.91000 228.33000 231.49000 25248890.00000 230.57910 371228 516 2025-08-19 04:00:00 230.09000 230.52830 227.12000 228.01000 29891012.00000 228.34230 405021 517 2025-08-20 04:00:00 227.12000 227.27000 220.91500 223.81000 36604319.00000 223.67450 493752 518 2025-08-21 04:00:00 222.65000 222.78000 220.50000 221.95000 32140459.00000 221.67140 415019 519 2025-08-22 04:00:00 222.79000 229.14000 220.82000 228.84000 37315341.00000 226.88500 490974 520 2025-08-25 04:00:00 227.35000 229.60000 227.31000 227.94000 22633695.00000 228.43500 348656 521 2025-08-26 04:00:00 227.11000 229.00000 226.02000 228.71000 26105373.00000 228.17790 279949 522 2025-08-27 04:00:00 228.57000 229.87000 227.81000 229.12000 21254479.00000 228.99400 308882 523 2025-08-28 04:00:00 229.00500 232.71000 228.02000 231.60000 33679585.00000 231.33380 388917 524 2025-08-29 04:00:00 231.32000 231.81250 228.16000 229.00000 26199170.00000 229.17020 344698 525 2025-09-02 04:00:00 223.52000 226.17000 221.83000 225.34000 38843883.00000 224.54110 508912 526 2025-09-03 04:00:00 225.21000 227.16990 224.36000 225.99000 29223134.00000 225.87080 422629 527 2025-09-04 04:00:00 231.18500 235.77000 230.78000 235.68000 59391779.00000 234.06250 707968 528 2025-09-05 04:00:00 235.19000 236.00000 231.93000 232.33000 36721802.00000 233.45160 493587 529 2025-09-08 04:00:00 234.94000 237.60000 233.75000 235.84000 33947104.00000 235.91620 489481 530 2025-09-09 04:00:00 236.35500 238.85000 235.08000 238.24000 27033778.00000 237.35240 391877 531 2025-09-10 
04:00:00 237.51500 237.68000 229.09620 230.33000 60907714.00000 231.86090 806586 532 2025-09-11 04:00:00 231.49000 231.53000 229.33770 229.95000 37485598.00000 230.50740 465912 533 2025-09-12 04:00:00 230.35000 230.79000 226.29000 228.15000 38496218.00000 228.43490 534413 534 2025-09-15 04:00:00 230.62500 233.73000 230.32000 231.43000 33243328.00000 231.68480 488104 535 2025-09-16 04:00:00 232.93500 235.90000 232.23000 234.05000 38203912.00000 234.40880 497761 536 2025-09-17 04:00:00 233.77000 234.30000 228.71000 231.62000 42815230.00000 231.14290 528520 537 2025-09-18 04:00:00 232.50000 233.48000 228.79000 231.23000 37931738.00000 231.52120 477593 538 2025-09-19 04:00:00 232.37000 234.16000 229.70000 231.48000 97943172.00000 231.95660 477965 539 2025-09-22 04:00:00 230.56000 230.56500 227.51000 227.63000 45914506.00000 228.54920 637304 540 2025-09-23 04:00:00 227.83000 227.86000 220.07000 220.71000 70956193.00000 222.20620 975072 541 2025-09-24 04:00:00 224.15000 224.56000 219.45000 220.21000 49509033.00000 221.00590 695256 542 2025-09-25 04:00:00 220.06000 220.67000 216.47000 218.15000 52226328.00000 218.71490 721783 543 2025-09-26 04:00:00 219.08000 221.05000 218.02000 219.78000 41650098.00000 219.73700 551565 544 2025-09-29 04:00:00 220.08000 222.60000 219.30000 222.17000 44259177.00000 221.53600 520136 545 2025-09-30 04:00:00 222.03000 222.24000 217.89000 219.57000 48396369.00000 219.40040 609847 546 2025-10-01 04:00:00 217.36000 222.15000 216.61000 220.63000 43933834.00000 220.24920 586098 547 2025-10-02 04:00:00 221.01000 222.81000 218.94500 222.41000 41258586.00000 221.31820 589620 548 2025-10-03 04:00:00 223.44000 224.20000 219.34000 219.51000 43639033.00000 221.31910 605202 549 2025-10-06 04:00:00 221.00000 221.73000 216.03000 220.90000 43690876.00000 219.70950 690825 550 2025-10-07 04:00:00 220.88000 222.89000 220.17000 221.78000 31194678.00000 221.38080 500797 551 2025-10-08 04:00:00 222.92000 226.73000 221.19000 225.22000 46685985.00000 224.52340 
624651 552 2025-10-09 04:00:00 224.99500 228.21000 221.75000 227.74000 46412122.00000 225.06130 660516 553 2025-10-10 04:00:00 226.21000 228.25000 216.00000 216.37000 72367511.00000 220.25770 1095476 554 2025-10-13 04:00:00 217.70000 220.68000 217.04000 220.07000 37809650.00000 219.73840 590235 555 2025-10-14 04:00:00 215.55500 219.32000 212.60000 216.39000 45665580.00000 216.41530 713414 556 2025-10-15 04:00:00 216.62000 217.71000 212.66000 215.57000 45909469.00000 215.58160 727130 557 2025-10-16 04:00:00 215.67000 218.59000 212.81010 214.47000 42414591.00000 215.51000 681784 558 2025-10-17 04:00:00 214.56000 214.80000 211.03000 213.04000 45986944.00000 213.01650 671699 559 2025-10-20 04:00:00 213.88000 216.69000 213.59000 216.48000 38882819.00000 215.59850 561944 560 2025-10-21 04:00:00 218.43000 223.32000 217.99000 222.03000 50494565.00000 221.51540 722930 561 2025-10-22 04:00:00 219.30000 220.00500 216.52000 217.95000 44308538.00000 218.11640 608630 562 2025-10-23 04:00:00 219.00000 221.30000 218.18000 221.09000 31539699.00000 220.39800 455056 563 2025-10-24 04:00:00 221.97000 225.40000 221.90000 224.21000 38684853.00000 224.04810 547597 564 2025-10-27 04:00:00 227.66000 228.40000 225.54000 226.97000 38266995.00000 227.23260 577811 565 2025-10-28 04:00:00 228.21500 231.48500 226.21000 229.25000 47099924.00000 228.83510 682226 566 2025-10-29 04:00:00 231.67200 232.82000 227.76000 230.30000 52035936.00000 230.03650 782147 567 2025-10-30 04:00:00 227.06000 228.44000 222.75000 222.86000 102252888.00000 231.31440 1399467 568 2025-10-31 04:00:00 250.10000 250.50000 243.98000 244.22000 166340683.00000 247.39120 1873179 569 2025-11-03 05:00:00 255.36000 258.60000 252.90000 254.00000 95997714.00000 255.41880 1284945 570 2025-11-04 05:00:00 250.38000 257.01000 248.66000 249.32000 51546311.00000 250.96710 777708 571 2025-11-05 05:00:00 249.03000 251.00000 246.16000 250.20000 40610602.00000 249.26070 598483 572 2025-11-06 05:00:00 249.15500 250.38000 242.17000 243.04000 
46004201.00000 244.82420 697593 573 2025-11-07 05:00:00 242.90000 244.90000 238.49000 244.41000 46374294.00000 241.84930 667465 574 2025-11-10 05:00:00 248.34000 251.75000 245.59000 248.40000 36476474.00000 248.34500 591047 575 2025-11-11 05:00:00 248.41000 249.74990 247.23000 249.10000 23563960.00000 248.64500 378251 576 2025-11-12 05:00:00 250.23500 250.37000 243.75000 244.20000 31190063.00000 245.57220 489440 577 2025-11-13 05:00:00 243.05000 243.75000 236.50000 237.58000 41401638.00000 239.23370 614028 578 2025-11-14 05:00:00 235.06000 238.73000 232.89000 234.69000 38956619.00000 235.71950 630487 579 2025-11-17 05:00:00 233.25000 234.60000 229.19000 232.87000 59918908.00000 231.89100 893852 We can see that the index is not continuous - but this is not an issue because use of the data would likely need to re-index the data or simply set the date column as the index.\nReferences https://polygon.io/ https://polygon.io/docs/rest/quickstart Code The jupyter notebook with the functions and all other code is available here. The html export of the jupyter notebook is available here. The pdf export of the jupyter notebook is available here.\n","date":"2025-08-10T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2025/08/10/data-pipelining-with-polygon/data-pipelining-with-polygon_original2.png","permalink":"https://www.jaredszajkowski.com/stack/2025/08/10/data-pipelining-with-polygon/","title":"Data Pipelining With Polygon"},{"content":"Introduction This is a quick post to illustrate how I collect and store crypto asset data from Coinbase. Essentially, the scripts below pull minute, hour, and daily data for the specified assets and if there is an existing data record, then the existing record is updated to include the most recent data. 
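This update-or-create flow (append the newest candles to an existing record, or pull the complete history when no record exists yet) can be sketched as below. The helper names `load_existing`, `fetch_candles`, and `save` are hypothetical stand-ins for the post's storage and API functions, not their actual signatures:

```python
import pandas as pd

def pull_data(product_id, granularity, load_existing, fetch_candles, save):
    """Update an existing record, or pull the full history if none exists.

    Hypothetical injected helpers (not the post's real signatures):
      load_existing(product_id) -> DataFrame indexed by timestamp, or None
      fetch_candles(product_id, granularity, start) -> DataFrame indexed by timestamp
      save(product_id, df) -> persists the combined record
    """
    existing = load_existing(product_id)
    if existing is None:
        # No stored record: pull the complete history from the API.
        combined = fetch_candles(product_id, granularity, start=None)
    else:
        # Stored record exists: fetch only from the last stored timestamp
        # onward, append, and drop any overlapping rows (keep the fresh copy).
        start = existing.index.max()
        new = fetch_candles(product_id, granularity, start=start)
        combined = pd.concat([existing, new])
        combined = combined[~combined.index.duplicated(keep="last")].sort_index()
    save(product_id, combined)
    return combined
```

Running it twice against a toy in-memory store shows the two branches: the first call pulls everything, the second appends only the new rows and overwrites the overlapping candle.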
If there is not an existing data record, then the complete historical record from Coinbase is pulled and stored.
Python Functions Here are the functions needed for this project:
coinbase_fetch_available_products: Fetch available products from the Coinbase Exchange API.
coinbase_fetch_full_history: Fetch the full historical data for a given product from the Coinbase Exchange API.
coinbase_fetch_historical_candles: Fetch historical candle data for a given product from the Coinbase Exchange API.
coinbase_pull_data: Update an existing record, or pull the full historical data for a given product from the Coinbase Exchange API.
Function Usage Coinbase Fetch Available Products This script pulls the list of available assets based on the inputs for the base and quote currency. Here's an example:

df = coinbase_fetch_available_products(
    base_currency=None,
    quote_currency="USD",
    status="online",
)

In this example, the quote_currency is provided as "USD".
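Under the hood, this amounts to matching each product against whichever optional arguments were supplied. A minimal offline sketch of that filtering, with an invented three-product sample payload standing in for the real response from the Coinbase Exchange `/products` endpoint:

```python
# Sample payload rows invented for illustration; the real function pulls
# this list from the Coinbase Exchange /products endpoint.
products = [
    {"id": "BTC-USD", "base_currency": "BTC", "quote_currency": "USD", "status": "online"},
    {"id": "ETH-BTC", "base_currency": "ETH", "quote_currency": "BTC", "status": "online"},
    {"id": "OLD-USD", "base_currency": "OLD", "quote_currency": "USD", "status": "delisted"},
]

def filter_products(products, base_currency=None, quote_currency=None, status=None):
    """Keep products matching every filter that was provided (None = no filter)."""
    out = []
    for p in products:
        if base_currency is not None and p["base_currency"] != base_currency:
            continue
        if quote_currency is not None and p["quote_currency"] != quote_currency:
            continue
        if status is not None and p["status"] != status:
            continue
        out.append(p)
    return out

usd_online = filter_products(products, quote_currency="USD", status="online")
print([p["id"] for p in usd_online])  # ['BTC-USD']
```

Passing `base_currency=None` (as in the example above) simply skips that filter, which is why the call returns every online USD-quoted product.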
This script checks all available assets that are priced against USD and returns a DataFrame listing all available assets:

| | id | base_currency | quote_currency | quote_increment | base_increment | display_name | min_market_funds | margin_enabled | post_only | limit_only | cancel_only | status | status_message | trading_disabled | fx_stablecoin | max_slippage_percentage | auction_mode | high_bid_limit_percentage |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 196 | 00-USD | 00 | USD | 0.00010 | 0.01000 | 00-USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |
| 360 | 1INCH-USD | 1INCH | USD | 0.00100 | 0.01000 | 1INCH-USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |
| 76 | A8-USD | A8 | USD | 0.00010 | 0.01000 | A8/USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |
| 609 | AAVE-USD | AAVE | USD | 0.01000 | 0.00100 | AAVE-USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |
| 387 | ABT-USD | ABT | USD | 0.00010 | 0.10000 | ABT-USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |
| 49 | ACH-USD | ACH | USD | 0.00000 | 0.10000 | ACH-USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |
| 308 | ACS-USD | ACS | USD | 0.00000 | 1.00000 | ACS-USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |
| 44 | ACX-USD | ACX | USD | 0.00010 | 0.10000 | ACX/USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |
| 212 | ADA-USD | ADA | USD | 0.00010 | 0.00000 | ADA-USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |
| 461 | AERGO-USD | AERGO | USD | 0.00010 | 0.10000 | AERGO-USD | 1 | False | False | False | False | online | | False | False | 0.03000 | False | |

(Output truncated: the full table contains one row per available USD-quoted product, running alphabetically from 00-USD through ZRX-USD.)

Coinbase Fetch Historical Candles

This script pulls the historical candles:

```python
df = coinbase_fetch_historical_candles(
    product_id="BTC-USD",
    start=datetime(2025, 1, 1),
    end=datetime(2025, 1, 1),
    granularity=86_400,
)
```

Specifically, the date/time, open, high, low, close, and volume levels:

| | time | low | high | open | close | volume |
|---|---|---|---|---|---|---|
| 0 | 2025-01-01 00:00:00 | 92743.63000 | 94960.91000 | 93347.59000 | 94383.59000 | 6871.73848 |

Coinbase Fetch Full History

This script pulls the full history for a specified asset:

```python
df = coinbase_fetch_full_history(
    product_id="BTC-USD",
    start=datetime(2025, 1, 1),
    end=datetime(2025, 1, 31),
    granularity=86_400,
)
```

The example above pulls daily data for one month, but the function can handle date ranges spanning years: it uses coinbase_fetch_historical_candles to pull 300 candles at a time, so that the API is not overloaded and no data is dropped.
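The 300-candle chunking described above can be sketched as a pure windowing step: split the requested date range into consecutive windows that each fit within the per-request candle limit, then issue one request per window. The helper below is illustrative, not the post's actual code:

```python
from datetime import datetime, timedelta


def candle_windows(start, end, granularity, max_candles=300):
    """Split [start, end] into consecutive windows of at most `max_candles`
    candles each, so every request stays within the per-request limit."""
    step = timedelta(seconds=granularity * max_candles)
    windows = []
    cursor = start
    while cursor < end:
        window_end = min(cursor + step, end)
        windows.append((cursor, window_end))
        cursor = window_end
    return windows


# A full year of daily candles exceeds 300, so it is split into two requests
windows = candle_windows(datetime(2024, 1, 1), datetime(2025, 1, 1), 86_400)
```

Each `(start, end)` pair would then be passed to coinbase_fetch_historical_candles, and the per-window results concatenated into a single DataFrame.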
Here\u0026rsquo;s the results for the above:\ntime low high open close volume 0 2025-01-01 00:00:00 92743.63000 94960.91000 93347.59000 94383.59000 6871.73848 1 2025-01-02 00:00:00 94177.00000 97776.99000 94383.59000 96903.19000 10912.47384 2 2025-01-03 00:00:00 96016.63000 98969.92000 96905.48000 98136.51000 9021.88538 3 2025-01-04 00:00:00 97516.65000 98761.02000 98139.85000 98209.85000 2742.08961 4 2025-01-05 00:00:00 97250.00000 98814.00000 98209.85000 98345.33000 2377.92176 5 2025-01-06 00:00:00 97900.00000 102500.00000 98347.65000 102279.41000 15173.55607 6 2025-01-07 00:00:00 96105.11000 102735.99000 102279.41000 96941.98000 16587.28692 7 2025-01-08 00:00:00 92500.00000 97254.35000 96941.98000 95036.63000 14182.29739 8 2025-01-09 00:00:00 91187.00000 95363.26000 95033.18000 92547.44000 9712.37853 9 2025-01-10 00:00:00 92209.25000 95862.92000 92547.44000 94701.18000 12634.03408 10 2025-01-11 00:00:00 93804.05000 94983.65000 94701.48000 94565.02000 2638.69957 11 2025-01-12 00:00:00 93670.30000 95383.84000 94569.91000 94509.62000 2025.81613 12 2025-01-13 00:00:00 89028.64000 95900.00000 94507.24000 94506.45000 13094.86359 13 2025-01-14 00:00:00 94311.36000 97353.29000 94507.35000 96534.96000 11210.74227 14 2025-01-15 00:00:00 96400.00000 100716.45000 96534.97000 100510.23000 13610.74729 15 2025-01-16 00:00:00 97277.58000 100880.00000 100504.27000 99981.78000 12312.37367 16 2025-01-17 00:00:00 99937.81000 105970.00000 99981.46000 104107.00000 20518.30949 17 2025-01-18 00:00:00 102233.45000 104933.15000 104107.00000 104435.00000 7835.29992 18 2025-01-19 00:00:00 99518.00000 106314.44000 104435.01000 101211.13000 13312.63686 19 2025-01-20 00:00:00 99416.27000 109358.01000 101217.78000 102145.43000 32342.18311 20 2025-01-21 00:00:00 100051.00000 107291.10000 102145.42000 106159.26000 19411.23489 21 2025-01-22 00:00:00 103100.00000 106431.34000 106159.27000 103667.11000 10730.01896 22 2025-01-23 00:00:00 101200.01000 106870.87000 103659.60000 103926.36000 
25064.86500 23 2025-01-24 00:00:00 102751.92000 107200.00000 103926.36000 104850.27000 12921.99361 24 2025-01-25 00:00:00 104104.00000 105294.00000 104866.13000 104733.56000 3404.85308 25 2025-01-26 00:00:00 102452.24000 105478.80000 104729.92000 102563.00000 4575.36612 26 2025-01-27 00:00:00 97715.03000 103228.46000 102565.28000 102062.42000 23647.14112 27 2025-01-28 00:00:00 100213.80000 103770.85000 102063.92000 101290.00000 9488.53429 28 2025-01-29 00:00:00 101275.60000 104829.64000 101290.01000 103747.25000 11403.20279 29 2025-01-30 00:00:00 103289.74000 106484.77000 103747.25000 104742.64000 13061.34881 30 2025-01-31 00:00:00 101506.00000 106090.00000 104742.63000 102411.26000 13313.68104 Coinbase Pull Data This script combines the above functions to perform the following:\nAttempt to read an existing pickle data file If a data file exists, then pull updated data Otherwise, pull all historical data available for that asset on Coinbase Store pickle and/or excel files of the data in the specified directories\nThrough the base_directory, source, and asset_class variables the script knows where in the local filesystem to look for an existing pickle file and where to store the resulting updated pickle and/or excel files:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 df = coinbase_pull_data( base_directory=DATA_DIR, source=\u0026#34;Coinbase\u0026#34;, asset_class=\u0026#34;Cryptocurrencies\u0026#34;, excel_export=False, pickle_export=True, output_confirmation=True, base_currency=\u0026#34;BTC\u0026#34;, quote_currency=\u0026#34;USD\u0026#34;, granularity=60, # 60=minute, 3600=hourly, 86400=daily status=\u0026#39;online\u0026#39;, # default status is \u0026#39;online\u0026#39; start_date=datetime(current_year, current_month - 1, 1), # default start date end_date=datetime.now() - timedelta(days=1), # updates data through 1 day ago due to lag in data availability ) By passing None as the base_currency and/or the quote_currency, the script will use the coinbase_fetch_available_products 
function to pull the list of all the available products, and then pull data for every asset in that list. This functionality is incredibly useful and makes acquiring data very straightforward, especially for a set of products sharing a specific base_currency or quote_currency.\nThe example above pulls the data for BTC-USD, and stores it in the following system directory:\nBASE_DIR/DATA_DIR/Coinbase/Cryptocurrencies/Minute\nAnd here are a few screenshots of the filesystem:\nReferences https://www.coinbase.com/ Code The jupyter notebook with the functions and all other code is available here. The html export of the jupyter notebook is available here. The pdf export of the jupyter notebook is available here.\n","date":"2025-07-06T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2025/07/06/data-pipelining-with-coinbase/data-pipelining-with-coinbase.png","permalink":"https://www.jaredszajkowski.com/stack/2025/07/06/data-pipelining-with-coinbase/","title":"Data Pipelining With Coinbase"},{"content":"Introduction In this post, I\u0026rsquo;ll cover the implementation of doit to automate the execution of Jupyter notebook files, Python scripts, and building the Hugo static site. Many of the concepts covered below were introduced recently in FINM 32900 - Full-Stack Quantitative Finance. 
This course emphasized the \u0026ldquo;full stack\u0026rdquo; approach, including the following:\nUse of GitHub Virtual environments Environment variables Use of various data sources (particularly WRDS) Processing/cleaning data GitHub actions Publishing data Restricting access to GitHub hosted sites Motivation The primary motivation for automation came from several realizations:\nSetting directory variables would avoid any issues with managing where the static files were stored locally I wanted to be able to pull updated data, execute Jupyter notebooks, and update the posts within my Hugo site without manual intervention or manual processes I like to include the html and PDF exports of the Jupyter notebooks, which required copying the exports to the \u0026ldquo;Public\u0026rdquo; folder of the website I needed a system to build the \u0026ldquo;index.md\u0026rdquo; files that are present in each post directory, and automatically include Python code and functions (again, without copying/pasting or manual processes) dodo.py The dodo.py file in the primary directory is referenced by doit and includes all imports, functions, environment variables, etc. as required to execute the desired code. My dodo.py is broken down as follows:\nImports The initial imports are as follows:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 ####################################### ## Import Libraries ####################################### import sys ## Make sure the src folder is in the path sys.path.insert(1, \u0026#34;./src/\u0026#34;) import re import shutil import subprocess import time import yaml from colorama import Fore, Style, init from datetime import datetime from os import environ, getcwd, path from pathlib import Path This first adds the /src/ subdirectory to the path (required later on), and then imports any other required modules. 
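The effect of that sys.path line can be demonstrated with a throwaway layout. The settings.py body below is a hypothetical stand-in for the post's actual settings.py (which is only linked, not shown), assuming a simple environment-variable-backed config() lookup:

```python
import os
import sys
import tempfile
from pathlib import Path

# Recreate the layout: a project root containing a src/ directory.
root = Path(tempfile.mkdtemp())
src = root / "src"
src.mkdir()

# Hypothetical minimal settings.py: config() values backed by
# environment variables with optional defaults.
(src / "settings.py").write_text(
    "import os\n"
    "def config(key, default=None):\n"
    "    return os.environ.get(key, default)\n"
)

# The sys.path.insert(1, './src/') line is what makes this import resolve:
sys.path.insert(1, str(src))
from settings import config

os.environ["BASE_DIR"] = "/tmp/site"
print(config("BASE_DIR"))        # /tmp/site
print(config("MISSING", "n/a"))  # n/a
```

Without the sys.path insertion, `from settings import config` would raise ModuleNotFoundError, since src/ is not a package on the default path.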
I prefer to sort all imports alphabetically for easy reference and readability.\nPrint PyDoit Text in Green Next, I use the following code to help differentiate the various outputs in the terminal when executing doit:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 # Code from lines 29-75 referenced from the UChicago # FINM 32900 - Full-Stack Quantitative Finance course # Credit to Jeremy Bejarano # https://github.com/jmbejara ## Custom reporter: Print PyDoit Text in Green # This is helpful because some tasks write to stderr and pollute the output in # the console. I don\u0026#39;t want to mute this output, because this can sometimes # cause issues when, for example, LaTeX hangs on an error and requires # presses on the keyboard before continuing. However, I want to be able # to easily see the task lines printed by PyDoit. I want them to stand out # from among all the other lines printed to the console. from doit.reporter import ConsoleReporter from settings import config ####################################### ## Slurm Configuration ####################################### try: in_slurm = environ[\u0026#34;SLURM_JOB_ID\u0026#34;] is not None except: in_slurm = False class GreenReporter(ConsoleReporter): def write(self, stuff, **kwargs): doit_mark = stuff.split(\u0026#34; \u0026#34;)[0].ljust(2) task = \u0026#34; \u0026#34;.join(stuff.split(\u0026#34; \u0026#34;)[1:]).strip() + \u0026#34; \u0026#34; output = ( Fore.GREEN + doit_mark + f\u0026#34; {path.basename(getcwd())}: \u0026#34; + task + Style.RESET_ALL ) self.outstream.write(output) if not in_slurm: DOIT_CONFIG = { \u0026#34;reporter\u0026#34;: GreenReporter, # other config here... # \u0026#34;cleanforget\u0026#34;: True, # Doit will forget about tasks that have been cleaned. 
\u0026#34;backend\u0026#34;: \u0026#34;sqlite3\u0026#34;, \u0026#34;dep_file\u0026#34;: \u0026#34;./.doit-db.sqlite\u0026#34;, } else: DOIT_CONFIG = { \u0026#34;backend\u0026#34;: \u0026#34;sqlite3\u0026#34;, \u0026#34;dep_file\u0026#34;: \u0026#34;./.doit-db.sqlite\u0026#34; } init(autoreset=True) Set Directory Variables Next, I establish the variables that reference some of the more important directories and subdirectories in the project:\n1 2 3 4 5 6 7 8 9 10 11 12 ####################################### ## Set directory variables ####################################### BASE_DIR = config(\u0026#34;BASE_DIR\u0026#34;) CONTENT_DIR = config(\u0026#34;CONTENT_DIR\u0026#34;) POSTS_DIR = config(\u0026#34;POSTS_DIR\u0026#34;) PAGES_DIR = config(\u0026#34;PAGES_DIR\u0026#34;) PUBLIC_DIR = config(\u0026#34;PUBLIC_DIR\u0026#34;) SOURCE_DIR = config(\u0026#34;SOURCE_DIR\u0026#34;) DATA_DIR = config(\u0026#34;DATA_DIR\u0026#34;) DATA_MANUAL_DIR = config(\u0026#34;DATA_MANUAL_DIR\u0026#34;) These directory variables are set from the settings.py file in the /src/ directory. Setting these directory variables allows me to reference them at any point later on in the dodo.py file.\nHelper Functions The following are several helper functions that are referenced in the tasks. 
These are somewhat self explanatory, and are used by the task functions in the next section below:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 ####################################### ## Helper functions ####################################### def copy_file(origin_path, destination_path, mkdir=True): \u0026#34;\u0026#34;\u0026#34;Create a Python action for copying a file.\u0026#34;\u0026#34;\u0026#34; def _copy_file(): origin = Path(origin_path) dest = Path(destination_path) if mkdir: dest.parent.mkdir(parents=True, exist_ok=True) shutil.copy2(origin, dest) return _copy_file def extract_front_matter(index_path): \u0026#34;\u0026#34;\u0026#34;Extract front matter as a dict from a Hugo index.md file.\u0026#34;\u0026#34;\u0026#34; text = index_path.read_text() match = re.search(r\u0026#34;(?s)^---(.*?)---\u0026#34;, text) if match: return yaml.safe_load(match.group(1)) return {} def notebook_source_hash(notebook_path): \u0026#34;\u0026#34;\u0026#34;Compute a SHA-256 hash of the notebook\u0026#39;s code and markdown cells. 
This includes all whitespace and comments.\u0026#34;\u0026#34;\u0026#34; import nbformat import hashlib with open(notebook_path, \u0026#34;r\u0026#34;, encoding=\u0026#34;utf-8\u0026#34;) as f: nb = nbformat.read(f, as_version=4) relevant_cells = [ cell[\u0026#34;source\u0026#34;] for cell in nb.cells if cell.cell_type in {\u0026#34;code\u0026#34;, \u0026#34;markdown\u0026#34;} ] full_content = \u0026#34;\\n\u0026#34;.join(relevant_cells) return hashlib.sha256(full_content.encode(\u0026#34;utf-8\u0026#34;)).hexdigest() def clean_pdf_export_pngs(subdir, notebook_name): \u0026#34;\u0026#34;\u0026#34;Remove .png files created by nbconvert during PDF export.\u0026#34;\u0026#34;\u0026#34; pattern = f\u0026#34;{notebook_name}_*_*.png\u0026#34; deleted = False for file in subdir.glob(pattern): print(f\u0026#34;🧹 Removing nbconvert temp image: {file}\u0026#34;) file.unlink() deleted = True if not deleted: print(f\u0026#34;✅ No temp PNGs to remove for {notebook_name}\u0026#34;) Tasks Next, we will look at the individual tasks that are being executed by doit.\nThe config task creates the base directories for the Hugo site:\n1 2 3 4 5 6 7 8 9 10 11 12 13 ####################################### ## PyDoit tasks ####################################### def task_config(): \u0026#34;\u0026#34;\u0026#34;Create empty directories for content, page, post, and public if they don\u0026#39;t exist\u0026#34;\u0026#34;\u0026#34; return { \u0026#34;actions\u0026#34;: [\u0026#34;ipython ./src/settings.py\u0026#34;], \u0026#34;file_dep\u0026#34;: [\u0026#34;./src/settings.py\u0026#34;], \u0026#34;targets\u0026#34;: [CONTENT_DIR, PAGES_DIR, POSTS_DIR, PUBLIC_DIR], \u0026#34;verbosity\u0026#34;: 2, \u0026#34;clean\u0026#34;: [], # Don\u0026#39;t clean these files by default. 
} The task_list_posts_subdirs function is not really necessary, but was used as an initial starting point when I began building the dodo.py file:\n1 2 3 4 5 6 7 8 9 def task_list_posts_subdirs(): \u0026#34;\u0026#34;\u0026#34;Create a list of the subdirectories of the posts directory\u0026#34;\u0026#34;\u0026#34; return { \u0026#34;actions\u0026#34;: [\u0026#34;python ./src/list_posts_subdirs.py\u0026#34;], \u0026#34;file_dep\u0026#34;: [\u0026#34;./src/settings.py\u0026#34;], # \u0026#34;targets\u0026#34;: [POSTS_DIR], \u0026#34;verbosity\u0026#34;: 2, \u0026#34;clean\u0026#34;: [], # Don\u0026#39;t clean these files by default. } The task_run_post_notebooks function performs the following actions:\nFinds all of the \u0026ldquo;post\u0026rdquo; subdirectories In each \u0026ldquo;post\u0026rdquo; directory, it executes the jupyter notebook file (if found) that has the same name as the post The hash of the code and markdown cells in the notebook is also checked, and if the hash has not changed since the last run, then it skips executing the notebook After the notebook is executed (or not), the log is updated with the date and action 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 def task_run_post_notebooks(): \u0026#34;\u0026#34;\u0026#34;Execute notebooks that match their subdirectory names and only when code or markdown content has changed\u0026#34;\u0026#34;\u0026#34; for subdir in POSTS_DIR.iterdir(): if not subdir.is_dir(): continue notebook_path = subdir / f\u0026#34;{subdir.name}.ipynb\u0026#34; if not notebook_path.exists(): continue # ✅ Skip subdirs with no matching notebook hash_file = subdir / f\u0026#34;{subdir.name}.last_source_hash\u0026#34; log_file = subdir / f\u0026#34;{subdir.name}.log\u0026#34; def source_has_changed(path=notebook_path, hash_path=hash_file, log_path=log_file): current_hash = notebook_source_hash(path) 
timestamp = datetime.now().strftime(\u0026#34;%Y-%m-%d %H:%M:%S\u0026#34;) if hash_path.exists(): old_hash = hash_path.read_text().strip() if current_hash != old_hash: print(f\u0026#34;🔁 Change detected in {path.name}\u0026#34;) return False # needs re-run # ✅ No change → log as skipped with log_path.open(\u0026#34;a\u0026#34;) as log: log.write(f\u0026#34;[{timestamp}] ⏩ Skipped (no changes): {path.name}\\n\u0026#34;) print(f\u0026#34;⏩ No change in hash for {path.name}\u0026#34;) return True # 🆕 No previous hash → must run print(f\u0026#34;🆕 No previous hash found for {path.name}\u0026#34;) return False def run_and_log(path=notebook_path, hash_path=hash_file, log_path=log_file): start_time = time.time() subprocess.run([ \u0026#34;jupyter\u0026#34;, \u0026#34;nbconvert\u0026#34;, \u0026#34;--execute\u0026#34;, \u0026#34;--to\u0026#34;, \u0026#34;notebook\u0026#34;, \u0026#34;--inplace\u0026#34;, \u0026#34;--log-level=ERROR\u0026#34;, str(path) ], check=True) elapsed = round(time.time() - start_time, 2) new_hash = notebook_source_hash(path) hash_path.write_text(new_hash) print(f\u0026#34;✅ Saved new hash for {path.name}\u0026#34;) timestamp = datetime.now().strftime(\u0026#34;%Y-%m-%d %H:%M:%S\u0026#34;) log_msg = f\u0026#34;[{timestamp}] ✅ Executed {path.name} in {elapsed}s\\n\u0026#34; with log_path.open(\u0026#34;a\u0026#34;) as f: f.write(log_msg) print(log_msg.strip()) yield { \u0026#34;name\u0026#34;: subdir.name, \u0026#34;actions\u0026#34;: [run_and_log], \u0026#34;file_dep\u0026#34;: [notebook_path], \u0026#34;uptodate\u0026#34;: [source_has_changed], \u0026#34;verbosity\u0026#34;: 2, } Next, the task_export_post_notebooks function exports the executed jupyter notebook to both HTML and PDF formats.\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 def task_export_post_notebooks(): \u0026#34;\u0026#34;\u0026#34;Export executed notebooks to HTML and PDF, and clean temp PNGs\u0026#34;\u0026#34;\u0026#34; for subdir in POSTS_DIR.iterdir(): 
if not subdir.is_dir(): continue notebook_name = subdir.name notebook_path = subdir / f\u0026#34;{notebook_name}.ipynb\u0026#34; html_output = subdir / f\u0026#34;{notebook_name}.html\u0026#34; pdf_output = subdir / f\u0026#34;{notebook_name}.pdf\u0026#34; if not notebook_path.exists(): continue yield { \u0026#34;name\u0026#34;: notebook_name, \u0026#34;actions\u0026#34;: [ f\u0026#34;jupyter nbconvert --to=html --log-level=WARN --output={html_output} {notebook_path}\u0026#34;, f\u0026#34;jupyter nbconvert --to=pdf --log-level=WARN --output={pdf_output} {notebook_path}\u0026#34;, (clean_pdf_export_pngs, [subdir, notebook_name]) ], \u0026#34;file_dep\u0026#34;: [notebook_path], \u0026#34;targets\u0026#34;: [html_output, pdf_output], \u0026#34;verbosity\u0026#34;: 2, \u0026#34;clean\u0026#34;: [], # Don\u0026#39;t clean these files by default. } The task_build_post_indices builds each index.md file within each \u0026ldquo;post\u0026rdquo; directory. It looks for an index_temp.md and an index_dep.txt file, which contains the dependencies required to build the index.md file. 
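The build_index.py script itself isn't shown in the post, but the `<!-- INSERT_..._HERE -->` marker convention suggests logic along these lines. This is a hypothetical sketch (the `build_index` helper and the sample filenames are illustrative), assuming each marker is named after the stem of a dependency file listed in index_dep.txt:

```python
import tempfile
from pathlib import Path

def build_index(post_dir: Path) -> str:
    """Replace each <!-- INSERT_<stem>_HERE --> marker in index_temp.md
    with the contents of the markdown file listed in index_dep.txt,
    then write the result to index.md."""
    template = (post_dir / "index_temp.md").read_text()
    for dep in (post_dir / "index_dep.txt").read_text().splitlines():
        dep = dep.strip()
        if not dep:
            continue
        marker = f"<!-- INSERT_{Path(dep).stem}_HERE -->"
        template = template.replace(marker, (post_dir / dep).read_text())
    (post_dir / "index.md").write_text(template)
    return template

# Demonstration with a throwaway post directory
post = Path(tempfile.mkdtemp())
(post / "index_temp.md").write_text("# Post\n<!-- INSERT_01_Stats_HERE -->\n")
(post / "index_dep.txt").write_text("01_Stats.md\n")
(post / "01_Stats.md").write_text("| a | b |\n")
result = build_index(post)
print(result)
```

Listing the template, the dependency file, and the script itself as file_dep entries (as the task does) means doit reruns the build whenever any of them changes.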
The dependencies are established within the jupyter notebook for each post, and the index_dep.txt file is also updated when the notebook is executed.\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 def task_build_post_indices(): \u0026#34;\u0026#34;\u0026#34;Run build_index.py in each post subdirectory to generate index.md\u0026#34;\u0026#34;\u0026#34; script_path = SOURCE_DIR / \u0026#34;build_index.py\u0026#34; for subdir in POSTS_DIR.iterdir(): if subdir.is_dir() and (subdir / \u0026#34;index_temp.md\u0026#34;).exists(): def run_script(subdir=subdir): subprocess.run( [\u0026#34;python\u0026#34;, str(script_path)], cwd=subdir, check=True ) yield { \u0026#34;name\u0026#34;: subdir.name, \u0026#34;actions\u0026#34;: [run_script], \u0026#34;file_dep\u0026#34;: [ subdir / \u0026#34;index_temp.md\u0026#34;, subdir / \u0026#34;index_dep.txt\u0026#34;, script_path, ], \u0026#34;targets\u0026#34;: [subdir / \u0026#34;index.md\u0026#34;], \u0026#34;verbosity\u0026#34;: 2, \u0026#34;clean\u0026#34;: [], # Don\u0026#39;t clean these files by default. 
} Here\u0026rsquo;s an example of how the index_dep.txt file is created, and then updated with each markdown file dependency via the export_track_md_deps function:\n1 2 3 4 5 6 7 8 9 10 11 12 # Create file to track markdown dependencies dep_file = Path(\u0026#34;index_dep.txt\u0026#34;) dep_file.write_text(\u0026#34;\u0026#34;) # Copy this \u0026lt;!-- INSERT_01_VIX_Stats_By_Year_HERE --\u0026gt; to index_temp.md export_track_md_deps(dep_file=dep_file, md_filename=\u0026#34;01_VIX_Stats_By_Year.md\u0026#34;, content=vix_stats_by_year.to_markdown(floatfmt=\u0026#34;.2f\u0026#34;)) # Copy this \u0026lt;!-- INSERT_02_VVIX_DF_Info_HERE --\u0026gt; to index_temp.md export_track_md_deps(dep_file=dep_file, md_filename=\u0026#34;02_VVIX_DF_Info.md\u0026#34;, content=df_info_markdown(vix)) # Copy this \u0026lt;!-- INSERT_11_Net_Profit_Percent_HERE --\u0026gt; to index_temp.md export_track_md_deps(dep_file=dep_file, md_filename=\u0026#34;11_Net_Profit_Percent.md\u0026#34;, content=net_profit_percent_str) Moving on, the task_clean_public removes the public directory within the static site. This is necessary to clear out any stale files or directories left over when the site is rebuilt.\n1 2 3 4 5 6 7 8 9 10 11 12 13 def task_clean_public(): \u0026#34;\u0026#34;\u0026#34;Remove the Hugo public directory before rebuilding the site.\u0026#34;\u0026#34;\u0026#34; def remove_public(): if PUBLIC_DIR.exists(): shutil.rmtree(PUBLIC_DIR) print(f\u0026#34;🧹 Deleted {PUBLIC_DIR}\u0026#34;) else: print(f\u0026#34;ℹ️ {PUBLIC_DIR} does not exist, nothing to delete.\u0026#34;) return { \u0026#34;actions\u0026#34;: [remove_public], \u0026#34;verbosity\u0026#34;: 2, \u0026#34;clean\u0026#34;: [], # Don\u0026#39;t clean these files by default. 
} The task_build_site builds the Hugo static site.\n1 2 3 4 5 6 7 8 def task_build_site(): \u0026#34;\u0026#34;\u0026#34;Build the Hugo static site\u0026#34;\u0026#34;\u0026#34; return { \u0026#34;actions\u0026#34;: [\u0026#34;hugo\u0026#34;], \u0026#34;task_dep\u0026#34;: [\u0026#34;clean_public\u0026#34;], \u0026#34;verbosity\u0026#34;: 2, \u0026#34;clean\u0026#34;: [], # Don\u0026#39;t clean these files by default. } The task_copy_notebook_exports copies the HTML and PDF exports generated above to the public folder. This is necessary due to how Hugo handles HTML and PDF files and excludes those when generating the static site public directories and files.\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 def task_copy_notebook_exports(): \u0026#34;\u0026#34;\u0026#34;Copy notebook HTML exports into the correct Hugo public/ date-based folders\u0026#34;\u0026#34;\u0026#34; for subdir in POSTS_DIR.iterdir(): if subdir.is_dir(): html_file = subdir / f\u0026#34;{subdir.name}.html\u0026#34; index_md = subdir / \u0026#34;index.md\u0026#34; if not html_file.exists() or not index_md.exists(): continue # Extract slug and date from front matter front_matter = extract_front_matter(index_md) slug = front_matter.get(\u0026#34;slug\u0026#34;, subdir.name) date_str = front_matter.get(\u0026#34;date\u0026#34;) if not date_str: continue # Format path like: public/YYYY/MM/DD/slug/ date_obj = datetime.fromisoformat(date_str) public_path = PUBLIC_DIR / f\u0026#34;{date_obj:%Y/%m/%d}\u0026#34; / slug target_path = public_path / f\u0026#34;{slug}.html\u0026#34; def copy_html(src=html_file, dest=target_path): dest.parent.mkdir(parents=True, exist_ok=True) shutil.copy2(src, dest) print(f\u0026#34;✅ Copied {src} → {dest}\u0026#34;) yield { \u0026#34;name\u0026#34;: subdir.name, \u0026#34;actions\u0026#34;: [copy_html], \u0026#34;file_dep\u0026#34;: [html_file, index_md], \u0026#34;targets\u0026#34;: [target_path], 
\u0026#34;task_dep\u0026#34;: [\u0026#34;build_site\u0026#34;], \u0026#34;verbosity\u0026#34;: 2, \u0026#34;clean\u0026#34;: [], # Don\u0026#39;t clean these files by default. } The task_create_schwab_callback creates a simple HTML file that will read the authorization code when using oauth with the Schwab API.\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 def task_create_schwab_callback(): \u0026#34;\u0026#34;\u0026#34;Create a Schwab callback URL by creating /public/schwab_callback/index.html and placing the html code in it\u0026#34;\u0026#34;\u0026#34; def create_callback(): callback_path = PUBLIC_DIR / \u0026#34;schwab_callback\u0026#34; / \u0026#34;index.html\u0026#34; callback_path.parent.mkdir(parents=True, exist_ok=True) html = \u0026#34;\u0026#34;\u0026#34;\u0026lt;!DOCTYPE html\u0026gt; \u0026lt;html lang=\u0026#34;en\u0026#34;\u0026gt; \u0026lt;head\u0026gt; \u0026lt;meta charset=\u0026#34;UTF-8\u0026#34; /\u0026gt; \u0026lt;title\u0026gt;Schwab OAuth Code\u0026lt;/title\u0026gt; \u0026lt;script\u0026gt; const params = new URLSearchParams(window.location.search); const code = params.get(\u0026#34;code\u0026#34;); document.write(\u0026#34;\u0026lt;h1\u0026gt;Authorization Code:\u0026lt;/h1\u0026gt;\u0026lt;p\u0026gt;\u0026#34; + code + \u0026#34;\u0026lt;/p\u0026gt;\u0026#34;); \u0026lt;/script\u0026gt; \u0026lt;/head\u0026gt; \u0026lt;body\u0026gt;\u0026lt;/body\u0026gt; \u0026lt;/html\u0026gt;\u0026#34;\u0026#34;\u0026#34; with open(callback_path, \u0026#34;w\u0026#34;) as f: f.write(html) print(f\u0026#34;✅ Created Schwab callback page at {callback_path}\u0026#34;) return { \u0026#34;actions\u0026#34;: [create_callback], \u0026#34;task_dep\u0026#34;: [\u0026#34;copy_notebook_exports\u0026#34;, \u0026#34;clean_public\u0026#34;], \u0026#34;verbosity\u0026#34;: 2, \u0026#34;clean\u0026#34;: [], # Don\u0026#39;t clean these files by default. 
} Finally, the task_deploy_site prompts for a commit message, stages any new files, commits the changes, and pushes the updates to GitHub.\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 def task_deploy_site(): \u0026#34;\u0026#34;\u0026#34;Prompt for a commit message and push to GitHub\u0026#34;\u0026#34;\u0026#34; def commit_and_push(): message = input(\u0026#34;What is the commit message? \u0026#34;) if not message.strip(): print(\u0026#34;❌ Commit message cannot be empty.\u0026#34;) return 1 # signal failure import subprocess subprocess.run([\u0026#34;git\u0026#34;, \u0026#34;add\u0026#34;, \u0026#34;.\u0026#34;], check=True) subprocess.run([\u0026#34;git\u0026#34;, \u0026#34;commit\u0026#34;, \u0026#34;-am\u0026#34;, message], check=True) subprocess.run([\u0026#34;git\u0026#34;, \u0026#34;push\u0026#34;], check=True) print(\u0026#34;✅ Pushed to GitHub.\u0026#34;) return { \u0026#34;actions\u0026#34;: [commit_and_push], \u0026#34;task_dep\u0026#34;: [\u0026#34;create_schwab_callback\u0026#34;], \u0026#34;verbosity\u0026#34;: 2, \u0026#34;clean\u0026#34;: [], # Don\u0026#39;t clean these files by default. } As (likely) expected, a good portion of the above code was generated by ChatGPT - somewhere around 50%-75%. The balance was written by me or adapted from the generated base code. 
Importantly, the general idea of automating the entire process within Hugo and processing the post subdirectories is original (as far as I know).\nFinally, the complete dodo.py and settings.py files are available in the jupyter notebook / HTML / PDF exports linked below.\nExecuting doit To execute doit, simply run:\n$ doit in the terminal after changing to the top-level directory.\nAlternatively, you can list the individual tasks with:\n$ doit list and then execute individually, such as:\n$ doit build_post_indices And finally, doit can be forced to execute all tasks with:\n$ doit --always-execute or an individual task with:\n$ doit --always-execute build_post_indices References https://pydoit.org/ https://github.com/jmbejara Code The jupyter notebook with the functions and all other code is available here. The html export of the jupyter notebook is available here. The pdf export of the jupyter notebook is available here.\n","date":"2025-06-29T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2025/06/29/automating-execution-jupyter-notebook-files-python-scripts-hugo-static-site-generation/automating-execution-jupyter-notebook-files-python-scripts-hugo-static-site-generation_final.jpg","permalink":"https://www.jaredszajkowski.com/stack/2025/06/29/automating-execution-jupyter-notebook-files-python-scripts-hugo-static-site-generation/","title":"Automating Execution of Jupyter Notebook Files, Python Scripts, and Hugo Static Site Generation"},{"content":"Trading History I have recently started trading based on the ideas from part 2, using the following as guidelines:\nLook for opportunities when the VIX is in the 7th - 10th deciles to buy puts, opening positions during the VIX spikes and closing them as volatility comes back down Look for opportunities when the VIX is in the 1st - 4th deciles to buy calls during periods of lower volatility based on seasonality When volatility spikes, sell call spreads and buy puts When volatility is low, buy calls Open positions with 2, 3, and 4 
months to expiration Close positions within 4 - 5 weeks of expiration to avoid theta drag The executed trades, closed positions, and open positions listed below are all automated updates from the transaction history exports from Schwab. The exported CSV files are available in the GitHub repository.\nTrades Executed Here are the trades executed to date, with any comments related to execution, market sentiment, reason for opening/closing position, VIX level, etc.\nTrade_Date Action Symbol Quantity Price Fees \u0026amp; Comm Amount Approx_VIX_Level Comments 2024-08-05 00:00:00 Buy to Open VIX 09/18/2024 34.00 P 1 10.95 1.08 1096.08 34.33 nan 2024-08-21 00:00:00 Sell to Close VIX 09/18/2024 34.00 P 1 17.95 1.08 1793.92 16.50 nan 2024-08-05 00:00:00 Buy to Open VIX 10/16/2024 40.00 P 1 16.35 1.08 1636.08 42.71 nan 2024-09-18 00:00:00 Sell to Close VIX 10/16/2024 40.00 P 1 21.54 1.08 2152.92 18.85 nan 2024-08-07 00:00:00 Buy to Open VIX 11/20/2024 25.00 P 2 5.90 2.16 1182.16 27.11 nan 2024-11-04 00:00:00 Sell to Close VIX 11/20/2024 25.00 P 2 6.10 2.16 1217.84 22.43 nan 2024-08-06 00:00:00 Buy to Open VIX 12/18/2024 30.00 P 1 10.25 1.08 1026.08 32.27 nan 2024-11-27 00:00:00 Sell to Close VIX 12/18/2024 30.00 P 1 14.95 1.08 1493.92 14.04 nan 2025-03-04 00:00:00 Buy to Open VIX 04/16/2025 25.00 P 5 5.65 5.40 2830.40 25.75 nan 2025-03-24 00:00:00 Sell to Close VIX 04/16/2025 25.00 P 5 7.00 5.40 3494.60 18.01 nan 2025-03-10 00:00:00 Buy to Open VIX 05/21/2025 26.00 P 5 7.10 5.40 3555.40 27.54 Missed opportunity to close position for 20% profit before vol spike in early April 2025 2025-04-04 00:00:00 Buy to Open VIX 05/21/2025 26.00 P 10 4.10 10.81 4110.81 38.88 Averaged down on existing position 2025-04-24 00:00:00 Sell to Close VIX 05/21/2025 26.00 P 7 3.50 7.57 2442.43 27.37 Sold half of position due to vol spike concerns and theta 2025-05-02 00:00:00 Sell to Close VIX 05/21/2025 26.00 P 4 4.35 4.32 1735.68 22.73 Sold half of remaining position due to vol spike concerns 
and theta 2025-05-07 00:00:00 Sell to Close VIX 05/21/2025 26.00 P 4 3.55 4.32 1415.68 24.49 Closed position ahead of Fed’s (Powell’s) comments 2025-04-04 00:00:00 Buy to Open VIX 05/21/2025 37.00 P 3 13.20 3.24 3963.24 36.46 nan 2025-05-07 00:00:00 Sell to Close VIX 05/21/2025 37.00 P 3 13.75 3.24 4121.76 24.51 Closed position ahead of Fed’s (Powell’s) comments 2025-04-08 00:00:00 Buy to Open VIX 05/21/2025 50.00 P 2 21.15 2.16 4232.16 nan nan 2025-04-24 00:00:00 Sell to Close VIX 05/21/2025 50.00 P 1 25.30 1.08 2528.92 nan nan 2025-04-25 00:00:00 Sell to Close VIX 05/21/2025 50.00 P 1 25.65 1.08 2563.92 nan nan 2025-04-03 00:00:00 Buy to Open VIX 06/18/2025 27.00 P 8 7.05 8.65 5648.65 27.62 nan 2025-04-08 00:00:00 Buy to Open VIX 06/18/2025 27.00 P 4 4.55 4.32 1824.32 55.44 Averaged down on existing position 2025-05-12 00:00:00 Sell to Close VIX 06/18/2025 27.00 P 6 7.55 6.49 4523.51 19.05 Market up on positive news of lowering tariffs with China; VIX down 15%, VVIX down 10% 2025-05-12 00:00:00 Sell to Close VIX 06/18/2025 27.00 P 6 7.40 6.49 4433.51 19.47 Market up on positive news of lowering tariffs with China; VIX down 15%, VVIX down 10% 2025-04-04 00:00:00 Buy to Open VIX 06/18/2025 36.00 P 3 13.40 3.24 4023.24 36.61 nan 2025-05-12 00:00:00 Sell to Close VIX 06/18/2025 36.00 P 3 16.00 3.24 4796.76 19.14 Market up on positive news of lowering tariffs with China; VIX down 15%, VVIX down 10% 2025-04-07 00:00:00 Buy to Open VIX 06/18/2025 45.00 P 2 18.85 2.16 3772.16 53.65 nan 2025-05-12 00:00:00 Sell to Close VIX 06/18/2025 45.00 P 2 25.00 2.16 4997.84 19.24 Market up on positive news of lowering tariffs with China; VIX down 15%, VVIX down 10% 2025-04-03 00:00:00 Buy to Open VIX 07/16/2025 29.00 P 5 8.55 5.40 4280.40 29.03 nan 2025-05-13 00:00:00 Sell to Close VIX 07/16/2025 29.00 P 3 10.40 3.24 3116.76 17.72 nan 2025-05-13 00:00:00 Sell to Close VIX 07/16/2025 29.00 P 2 10.30 2.16 2057.84 17.68 nan 2025-04-04 00:00:00 Buy to Open VIX 07/16/2025 36.00 P 3 13.80 
3.24 4143.24 36.95 nan 2025-05-13 00:00:00 Sell to Close VIX 07/16/2025 36.00 P 1 17.00 1.08 1698.92 17.79 nan 2025-05-13 00:00:00 Sell to Close VIX 07/16/2025 36.00 P 2 16.90 2.16 3377.84 17.72 nan 2025-04-07 00:00:00 Buy to Open VIX 07/16/2025 45.00 P 2 21.55 2.16 4312.16 46.17 nan 2025-05-13 00:00:00 Sell to Close VIX 07/16/2025 45.00 P 2 25.65 2.16 5127.84 17.96 nan 2025-04-07 00:00:00 Buy to Open VIX 08/20/2025 45.00 P 2 21.75 2.16 4352.16 49.07 nan 2025-05-13 00:00:00 Sell to Close VIX 08/20/2025 45.00 P 2 25.40 2.16 5077.84 18.06 nan 2025-06-26 00:00:00 Buy to Open VIX 09/17/2025 20.00 C 10 3.00 10.81 3010.81 16.37 Opened long dated call position; VIX level at 4th historical decile 2025-08-01 00:00:00 Sell to Close VIX 09/17/2025 20.00 C 5 3.05 5.40 1519.60 20.48 Sold half of position due to theta drag, held remaining half for vol spike 2025-08-12 00:00:00 Buy to Open VIX 09/17/2025 20.00 C 10 1.54 9.31 1549.31 14.87 Doubled existing position to reduce cost basis 2025-08-22 00:00:00 Buy to Open VIX 09/17/2025 20.00 C 15 0.88 12.01 1332.01 nan nan 2025-08-29 00:00:00 Sell to Close VIX 09/17/2025 20.00 C 15 0.80 12.01 1187.99 nan nan 2025-09-02 00:00:00 Sell to Close VIX 09/17/2025 20.00 C 15 1.33 13.96 1981.04 nan nan 2025-08-12 00:00:00 Buy to Open VIX 10/22/2025 19.00 C 10 3.05 10.81 3060.81 15.22 nan 2025-09-29 00:00:00 Sell to Close VIX 10/22/2025 19.00 C 10 1.20 9.31 1190.69 nan nan 2025-08-22 00:00:00 Buy to Open VIX 10/22/2025 20.00 C 10 2.12 10.81 2130.81 nan nan 2025-09-02 00:00:00 Sell to Close VIX 10/22/2025 20.00 C 10 2.65 10.81 2639.19 nan nan 2025-07-23 00:00:00 Buy to Open VIX 10/22/2025 21.00 C 10 2.92 10.81 2930.81 15.40 Continued low volatility, opened long dated call position; VIX level at 4th historical decile 2025-09-29 00:00:00 Sell to Close VIX 10/22/2025 21.00 C 10 0.91 8.01 901.99 nan nan 2025-06-26 00:00:00 Buy to Open VIX 10/22/2025 22.00 C 10 2.94 10.81 2950.81 16.43 Opened long dated call position; VIX level at 4th historical 
decile 2025-09-15 00:00:00 Sell to Close VIX 10/22/2025 22.00 C 10 1.08 9.31 1070.69 nan nan 2025-07-17 00:00:00 Buy to Open VIX 10/22/2025 23.00 C 10 2.75 10.81 2760.81 16.86 Continued low volatility, opened long dated call position; VIX level at 4th historical decile 2025-09-15 00:00:00 Sell to Close VIX 10/22/2025 23.00 C 10 0.98 8.01 971.99 nan nan 2025-09-29 00:00:00 Buy to Open VIX 11/19/2025 16.00 C 10 3.65 10.81 3660.81 nan nan 2025-10-10 00:00:00 Sell to Close VIX 11/19/2025 16.00 C 10 4.80 10.81 4789.19 nan nan 2025-08-22 00:00:00 Buy to Open VIX 11/19/2025 19.00 C 10 3.17 10.81 3180.81 nan nan 2025-09-02 00:00:00 Sell to Close VIX 11/19/2025 19.00 C 10 3.70 10.81 3689.19 nan nan 2025-08-13 00:00:00 Buy to Open VIX 11/19/2025 20.00 C 10 3.26 10.81 3270.81 14.56 VIX at ~0 decile based on the YTD VIX data 2025-10-09 00:00:00 Sell to Close VIX 11/19/2025 20.00 C 10 2.08 10.81 2069.19 nan nan 2025-08-12 00:00:00 Buy to Open VIX 11/19/2025 21.00 C 10 3.00 10.81 3010.81 15.17 nan 2025-10-08 00:00:00 Sell to Close VIX 11/19/2025 21.00 C 10 1.83 9.31 1820.69 nan nan 2025-09-11 00:00:00 Buy to Open VIX 12/17/2025 17.00 C 10 3.90 10.81 3910.81 nan nan 2025-10-10 00:00:00 Sell to Close VIX 12/17/2025 17.00 C 10 4.60 10.81 4589.19 nan nan Volatility In August 2024 Plot with VIX high/low, trade side, VIX option, and VIX level at trade date/time:\nClosed positions:\nSymbol Amount_Buy Quantity_Buy Amount_Sell Quantity_Sell Realized_PnL Percent_PnL VIX 09/18/2024 34.00 P 1096.08 1 1793.92 1 697.84 0.64 VIX 10/16/2024 40.00 P 1636.08 1 2152.92 1 516.84 0.32 VIX 11/20/2024 25.00 P 1182.16 2 1217.84 2 35.68 0.03 VIX 12/18/2024 30.00 P 1026.08 1 1493.92 1 467.84 0.46 Open positions:\nSymbol Amount_Buy Quantity_Buy Total Opened Position Market Value: $4,940.40 Total Closed Position Market Value: $6,658.60 Net Profit/Loss: $1,718.20 Percent Profit/Loss: 34.78%\nVolatility In March 2025 Plot with VIX high/low, trade side, VIX option, and VIX level at trade date/time:\nClosed 
positions:\nSymbol Amount_Buy Quantity_Buy Amount_Sell Quantity_Sell Realized_PnL Percent_PnL VIX 04/16/2025 25.00 P 2830.40 5 3494.60 5 664.20 0.23 Open positions:\nSymbol Amount_Buy Quantity_Buy Total Opened Position Market Value: $2,830.40 Total Closed Position Market Value: $3,494.60 Net Profit/Loss: $664.20 Percent Profit/Loss: 23.47%\nVolatility In April 2025 Plot with VIX high/low, trade side, VIX option, and VIX level at trade date/time:\nClosed positions:\nSymbol Amount_Buy Quantity_Buy Amount_Sell Quantity_Sell Realized_PnL Percent_PnL VIX 05/21/2025 26.00 P 7666.21 15 5593.79 15 -2072.42 -0.27 VIX 05/21/2025 37.00 P 3963.24 3 4121.76 3 158.52 0.04 VIX 05/21/2025 50.00 P 4232.16 2 5092.84 2 860.68 0.20 VIX 06/18/2025 27.00 P 7472.97 12 8957.02 12 1484.05 0.20 VIX 06/18/2025 36.00 P 4023.24 3 4796.76 3 773.52 0.19 VIX 06/18/2025 45.00 P 3772.16 2 4997.84 2 1225.68 0.32 VIX 07/16/2025 29.00 P 4280.40 5 5174.60 5 894.20 0.21 VIX 07/16/2025 36.00 P 4143.24 3 5076.76 3 933.52 0.23 VIX 07/16/2025 45.00 P 4312.16 2 5127.84 2 815.68 0.19 VIX 08/20/2025 45.00 P 4352.16 2 5077.84 2 725.68 0.17 Open positions:\nSymbol Amount_Buy Quantity_Buy Total Opened Position Market Value: $48,217.94 Total Closed Position Market Value: $54,017.05 Net Profit/Loss: $5,799.11 Percent Profit/Loss: 12.03%\nLow Volatility In June, July, August 2025 Plot with VIX high/low, trade side, VIX option, and VIX level at trade date/time:\nClosed positions:\nSymbol Amount_Buy Quantity_Buy Amount_Sell Quantity_Sell Realized_PnL Percent_PnL VIX 09/17/2025 20.00 C 5892.13 35 4688.63 35 -1203.50 -0.20 VIX 10/22/2025 19.00 C 3060.81 10 1190.69 10 -1870.12 -0.61 VIX 10/22/2025 20.00 C 2130.81 10 2639.19 10 508.38 0.24 VIX 10/22/2025 21.00 C 2930.81 10 901.99 10 -2028.82 -0.69 VIX 10/22/2025 22.00 C 2950.81 10 1070.69 10 -1880.12 -0.64 VIX 10/22/2025 23.00 C 2760.81 10 971.99 10 -1788.82 -0.65 VIX 11/19/2025 16.00 C 3660.81 10 4789.19 10 1128.38 0.31 VIX 11/19/2025 19.00 C 3180.81 10 3689.19 10 508.38 
0.16 VIX 11/19/2025 20.00 C 3270.81 10 2069.19 10 -1201.62 -0.37 VIX 11/19/2025 21.00 C 3010.81 10 1820.69 10 -1190.12 -0.40 VIX 12/17/2025 17.00 C 3910.81 10 4589.19 10 678.38 0.17 Open positions:\nSymbol Amount_Buy Quantity_Buy Total Opened Position Market Value: $36,760.23 Total Closed Position Market Value: $28,420.63 Net Profit/Loss: $-8,339.60 Percent Profit/Loss: -22.69%\nComplete Trade History (Closed Positions) Total Opened Position Market Value: $92,748.97 Total Closed Position Market Value: $92,590.88 Net Profit/Loss: $-158.09 Percent Profit/Loss: -0.17%\nReferences https://www.cboe.com/tradable_products/vix/ https://github.com/ranaroussi/yfinance Code Note: The files below are identical to those linked in part 1 and part 2.\nThe jupyter notebook with the functions and all other code is available here. The html export of the jupyter notebook is available here. The pdf export of the jupyter notebook is available here.\n","date":"2025-03-03T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2025/03/03/investigating-a-vix-trading-signal-part-3-trading/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2025/03/03/investigating-a-vix-trading-signal-part-3-trading/","title":"Investigating A VIX Trading Signal, Part 3: Trading"},{"content":"Investigating A Signal Continuing from where we left off in part 1, we will now consider the idea of a spike level in the VIX and how we might use it to generate a signal. These elevated levels usually occur during market sell-off events or longer term drawdowns in the S\u0026amp;P 500. Sometimes the VIX reverts to recent levels after a spike, but other times levels remain elevated for weeks or even months.\nDetermining A Spike Level We will start with the 10 day simple moving average (SMA) of the daily high level to get an idea of what is happening recently with the VIX.
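Before applying this to the actual VIX series, the smoothing-and-threshold mechanics can be sketched on toy data (hypothetical values, not the downloaded VIX data): a trailing 10 day SMA is shifted by one day, multiplied by a spike multiplier of 1.25, and compared against each day's high. A gradual ramp stays under the threshold while a sudden jump triggers it:

```python
import pandas as pd

# Toy 'High' series: ten flat days, a gradual ramp, then a sudden jump
high = pd.Series([20.0] * 10 + [21.0, 22.0, 23.0, 24.0, 25.0, 34.0])

spike_multiplier = 1.25  # spike threshold: 25% above the trailing SMA

# Trailing 10 day SMA, shifted by 1 so each day is compared
# against the average of the previous 10 days
sma_10_shift = high.rolling(window=10).mean().shift(1)
spike = high >= sma_10_shift * spike_multiplier

# Only the final jump to 34 exceeds the threshold;
# the gradual ramp from 21 to 25 never fires the signal
print(spike.tolist())
```

The Spike_SMA flag computed for the real series below works the same way, just on the daily high of the VIX.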
We'll then pick an arbitrary spike level (25% above the 10 day SMA), and our signal is generated if the VIX hits a level above the spike threshold.

The idea is that the 10 day SMA smooths out the recent short term volatility in the VIX, so gradual increases in the VIX are not interpreted as spike events.

We will also generate the 20 and 50 day SMAs for reference, again to see what is happening with the level of the VIX over slightly longer timeframes.

Here's the code for the above:

```python
# Define the spike multiplier for detecting significant spikes
spike_level = 1.25

# =========================
# Simple Moving Averages (SMA)
# =========================

# Calculate 10-period SMA of 'High'
vix['High_SMA_10'] = vix['High'].rolling(window=10).mean()

# Shift the 10-period SMA by 1 to compare with current 'High'
vix['High_SMA_10_Shift'] = vix['High_SMA_10'].shift(1)

# Calculate the spike level based on shifted SMA and spike multiplier
vix['Spike_Level_SMA'] = vix['High_SMA_10_Shift'] * spike_level

# Calculate 20-period SMA of 'High'
vix['High_SMA_20'] = vix['High'].rolling(window=20).mean()

# Determine if 'High' exceeds the spike level (indicates a spike)
vix['Spike_SMA'] = vix['High'] >= vix['Spike_Level_SMA']

# Calculate 50-period SMA of 'High' for trend analysis
vix['High_SMA_50'] = vix['High'].rolling(window=50).mean()

# =========================
# Exponential Moving Averages (EMA)
# =========================

# Calculate 10-period EMA of 'High'
vix['High_EMA_10'] = vix['High'].ewm(span=10, adjust=False).mean()

# Shift the 10-period EMA by 1 to compare with current 'High'
vix['High_EMA_10_Shift'] = vix['High_EMA_10'].shift(1)

# Calculate the spike level based on shifted EMA and spike multiplier
vix['Spike_Level_EMA'] = vix['High_EMA_10_Shift'] * spike_level

# Calculate 20-period EMA of 'High'
vix['High_EMA_20'] = vix['High'].ewm(span=20, adjust=False).mean()

# Determine if 'High' exceeds the spike level (indicates a spike)
vix['Spike_EMA'] = vix['High'] >= vix['Spike_Level_EMA']

# Calculate 50-period EMA of 'High' for trend analysis
vix['High_EMA_50'] = vix['High'].ewm(span=50, adjust=False).mean()
```

For this exercise, we will use simple moving averages.

Spike Counts (Signals) By Year

To investigate the number of spike events (or signals) that we receive on a yearly basis, we can run the following:

```python
# Ensure the index is a DatetimeIndex
vix.index = pd.to_datetime(vix.index)

# Create a new column for the year extracted from the date index
vix['Year'] = vix.index.year

# Group by year and the 'Spike_SMA' column, then count occurrences
spike_count_SMA = vix.groupby(['Year', 'Spike_SMA']).size().unstack(fill_value=0)
display(spike_count_SMA)
```

Which gives us the following:

| Year | False | True |
|------|-------|------|
| 1990 | 248 | 5 |
| 1991 | 249 | 4 |
| 1992 | 250 | 4 |
| 1993 | 251 | 2 |
| 1994 | 243 | 9 |
| 1995 | 252 | 0 |
| 1996 | 248 | 6 |
| 1997 | 247 | 6 |
| 1998 | 243 | 9 |
| 1999 | 250 | 2 |
| 2000 | 248 | 4 |
| 2001 | 240 | 8 |
| 2002 | 248 | 4 |
| 2003 | 251 | 1 |
| 2004 | 250 | 2 |
| 2005 | 250 | 2 |
| 2006 | 242 | 9 |
| 2007 | 239 | 12 |
| 2008 | 238 | 15 |
| 2009 | 249 | 3 |
| 2010 | 239 | 13 |
| 2011 | 240 | 12 |
| 2012 | 248 | 2 |
| 2013 | 249 | 3 |
| 2014 | 235 | 17 |
| 2015 | 240 | 12 |
| 2016 | 234 | 18 |
| 2017 | 244 | 7 |
| 2018 | 228 | 23 |
| 2019 | 241 | 11 |
| 2020 | 224 | 29 |
| 2021 | 235 | 17 |
| 2022 | 239 | 12 |
| 2023 | 246 | 4 |
| 2024 | 237 | 15 |
| 2025 | 203 | 17 |

And here is the plot to aid with visualization. Based on the plot, it seems as though volatility has increased since the early 2000s:

Spike Counts (Signals) Plots By Year

The most recent yearly plots are shown below for when signals are generated. The images for the previous years are linked below.

Spike/Signals, 1990 Spike/Signals, 1991 Spike/Signals, 1992 Spike/Signals, 1993 Spike/Signals, 1994 Spike/Signals, 1995 Spike/Signals, 1996 Spike/Signals, 1997 Spike/Signals, 1998 Spike/Signals, 1999 Spike/Signals, 2000 Spike/Signals, 2001 Spike/Signals, 2002 Spike/Signals, 2003 Spike/Signals, 2004 Spike/Signals, 2005 Spike/Signals, 2006 Spike/Signals, 2007 Spike/Signals, 2008 Spike/Signals, 2009 Spike/Signals, 2010 Spike/Signals, 2011 Spike/Signals, 2012 Spike/Signals, 2013 Spike/Signals, 2014 Spike/Signals, 2015 Spike/Signals, 2016 Spike/Signals, 2017 Spike/Signals, 2018 Spike/Signals, 2019

2020 2021 2022 2023 2024 2025

For comparison with the VVIX plot for 2025:

Spike Counts (Signals) Plots By Decade

And here are the plots for the signals generated over the past 3 decades:

1990 - 1994 1995 - 1999 2000 - 2004 2005 - 2009 2010 - 2014 2015 - 2019 2020 - 2024 2025 - Present

For comparison with the VVIX plot for 2025:

References https://www.cboe.com/tradable_products/vix/ https://github.com/ranaroussi/yfinance Code Note: The files below are identical to those linked in part 1.

The jupyter notebook with the functions and all other code is available here. The html export of the jupyter notebook is available here.
The pdf export of the jupyter notebook is available here.\n","date":"2025-03-02T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2025/03/02/investigating-a-vix-trading-signal-part-2-finding-a-signal/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2025/03/02/investigating-a-vix-trading-signal-part-2-finding-a-signal/","title":"Investigating A VIX Trading Signal, Part 2: Finding A Signal"},{"content":"Introduction From the CBOE VIX website:\n\u0026ldquo;Cboe Global Markets revolutionized investing with the creation of the Cboe Volatility Index® (VIX® Index), the first benchmark index to measure the market’s expectation of future volatility. The VIX Index is based on options of the S\u0026amp;P 500® Index, considered the leading indicator of the broad U.S. stock market. The VIX Index is recognized as the world’s premier gauge of U.S. equity market volatility.\u0026rdquo;\nIn this tutorial, we will investigate finding a signal to use as a basis to trade the VIX.\nVIX Data I don\u0026rsquo;t have access to data for the VIX through Nasdaq Data Link or any other data source, but for our purposes Yahoo Finance is sufficient. Using the yfinance Python module, we can pull what we need and quickly dump it to Excel to retain it for future use.\nPython Functions Here are the functions needed for this project:\ncalc_vix_trade_pnl: Calculates the profit/loss from VIX options trades. df_info: A simple function to display the information about a DataFrame and the first five rows and last five rows. df_info_markdown: Similar to the df_info function above, except that it converts the output to markdown. export_track_md_deps: Exports various text outputs to markdown files, which are included in the index.md file created when building the site with Hugo. load_data: Load data from a CSV, Excel, or Pickle file into a pandas DataFrame. pandas_set_decimal_places: Set the number of decimal places displayed for floating-point numbers in pandas.
plot_price: Plot the price data from a DataFrame for a specified date range and columns. plot_stats: Generate a scatter plot for the mean OHLC prices. plot_vix_with_trades: Plot the VIX daily high and low prices, along with the VIX spikes, and trades. yf_pull_data: Download daily price data from Yahoo Finance and export it.

Data Overview (VIX)

Acquire CBOE Volatility Index (VIX) Data

First, let's get the data:

```python
yf_pull_data(
    base_directory=DATA_DIR,
    ticker='^VIX',
    source='Yahoo_Finance',
    asset_class='Indices',
    excel_export=True,
    pickle_export=True,
    output_confirmation=True,
)
```

Load Data - VIX

Now that we have the data, let's load it up and take a look:

```python
# Set decimal places
pandas_set_decimal_places(2)

# VIX
vix = load_data(
    base_directory=DATA_DIR,
    ticker='^VIX',
    source='Yahoo_Finance',
    asset_class='Indices',
    timeframe='Daily',
)

# Set 'Date' column as datetime
vix['Date'] = pd.to_datetime(vix['Date'])

# Drop 'Volume'
vix.drop(columns=['Volume'], inplace=True)

# Set Date as index
vix.set_index('Date', inplace=True)

# Check to see if there are any NaN values
vix[vix['High'].isna()]

# Forward fill to clean up missing data
vix['High'] = vix['High'].ffill()
```

DataFrame Info - VIX

Now, running:

```python
df_info(vix)
```

Gives us the following:

The columns, shape, and data types are: \u0026lt;class \u0026#39;pandas.core.frame.DataFrame\u0026#39;\u0026gt; DatetimeIndex: 9037 entries, 1990-01-02 to 2025-11-17 Data columns (total 4 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Close 9037 non-null float64 1 High 9037 non-null float64
2 Low 9037 non-null float64 3 Open 9037 non-null float64 dtypes: float64(4) memory usage: 353.0 KB

The first 5 rows are:

| Date | Close | High | Low | Open |
|------|-------|------|-----|------|
| 1990-01-02 00:00:00 | 17.24 | 17.24 | 17.24 | 17.24 |
| 1990-01-03 00:00:00 | 18.19 | 18.19 | 18.19 | 18.19 |
| 1990-01-04 00:00:00 | 19.22 | 19.22 | 19.22 | 19.22 |
| 1990-01-05 00:00:00 | 20.11 | 20.11 | 20.11 | 20.11 |
| 1990-01-08 00:00:00 | 20.26 | 20.26 | 20.26 | 20.26 |

The last 5 rows are:

| Date | Close | High | Low | Open |
|------|-------|------|-----|------|
| 2025-11-11 00:00:00 | 17.28 | 18.01 | 17.25 | 17.90 |
| 2025-11-12 00:00:00 | 17.51 | 18.06 | 17.10 | 17.21 |
| 2025-11-13 00:00:00 | 20.00 | 21.31 | 17.51 | 17.61 |
| 2025-11-14 00:00:00 | 19.83 | 23.03 | 19.56 | 21.33 |
| 2025-11-17 00:00:00 | 22.38 | 23.44 | 19.54 | 19.58 |

Statistics - VIX

Some interesting statistics jump out at us when we look at the mean, standard deviation, minimum, and maximum values for the full dataset. The following code:

```python
vix_stats = vix.describe()

num_std = [-1, 0, 1, 2, 3, 4, 5]

for num in num_std:
    vix_stats.loc[f'mean + {num} std'] = {
        'Open': vix_stats.loc['mean']['Open'] + num * vix_stats.loc['std']['Open'],
        'High': vix_stats.loc['mean']['High'] + num * vix_stats.loc['std']['High'],
        'Low': vix_stats.loc['mean']['Low'] + num * vix_stats.loc['std']['Low'],
        'Close': vix_stats.loc['mean']['Close'] + num * vix_stats.loc['std']['Close'],
    }

display(vix_stats)
```

Gives us:

Close High Low Open count 9037.00 9037.00 9037.00 9037.00 mean 19.46 20.38 18.79 19.56 std 7.79 8.35 7.35 7.87 min 9.14 9.31 8.56 9.01 25% 13.92 14.58 13.44 13.96 50% 17.61 18.33 17.02 17.66 75% 22.76 23.76 22.09 22.92 max 82.69 89.53 72.76 82.69 mean + -1 std 11.67 12.02 11.44 11.69 mean + 0 std 19.46 20.38 18.79 19.56 mean + 1 std
27.26 28.73 26.14 27.43 mean + 2 std 35.05 37.08 33.49 35.29 mean + 3 std 42.85 45.43 40.84 43.16 mean + 4 std 50.64 53.78 48.19 51.03 mean + 5 std 58.44 62.13 55.54 58.90

We can also run the statistics individually for each year:

```python
# Group by year and calculate mean, std, min, and max for OHLC
vix_stats_by_year = vix.groupby(vix.index.year)[['Open', 'High', 'Low', 'Close']].agg(['mean', 'std', 'min', 'max'])

# Flatten the column MultiIndex
vix_stats_by_year.columns = ['_'.join(col).strip() for col in vix_stats_by_year.columns.values]
vix_stats_by_year.index.name = 'Year'

display(vix_stats_by_year)
```

Gives us:

Year Open_mean Open_std Open_min Open_max High_mean High_std High_min High_max Low_mean Low_std Low_min Low_max Close_mean Close_std Close_min Close_max 1990 23.06 4.74 14.72 36.47 23.06 4.74 14.72 36.47 23.06 4.74 14.72 36.47 23.06 4.74 14.72 36.47 1991 18.38 3.68 13.95 36.20 18.38 3.68 13.95 36.20 18.38 3.68 13.95 36.20 18.38 3.68 13.95 36.20 1992 15.23 2.26 10.29 20.67 16.03 2.19 11.90 25.13 14.85 2.14 10.29 19.67 15.45 2.12 11.51 21.02 1993 12.70 1.37 9.18 16.20 13.34 1.40 9.55 18.31 12.25 1.28 8.89 15.77 12.69 1.33 9.31 17.30 1994 13.79 2.06 9.86 23.61 14.58 2.28 10.31 28.30 13.38 1.99 9.59 23.61 13.93 2.07 9.94 23.87 1995 12.27 1.03 10.29 15.79 12.93 1.07 10.95 16.99 11.96 0.98 10.06 14.97 12.39 0.97 10.36 15.74 1996 16.31 1.92 11.24 23.90 16.99 2.12 12.29 27.05 15.94 1.82 11.11 21.43 16.44 1.94 12.00 21.99 1997 22.43 4.33 16.67 45.69 23.11 4.56 18.02 48.64 21.85 3.98 16.36 36.43 22.38 4.14 17.09 38.20 1998 25.68 6.96 16.42 47.95 26.61 7.36 16.50 49.53 24.89 6.58 16.10 45.58 25.60 6.86 16.23 45.74 1999 24.39 2.90 18.05 32.62 25.20 3.01 18.48 33.66 23.75 2.76 17.07 31.13 24.37 2.88 17.42 32.98 2000 23.41 3.43 16.81 33.70 24.10 3.66 17.06 34.31 22.75 3.19 16.28 30.56 23.32 3.41 16.53 33.49 2001 26.04 4.98 19.21 48.93 26.64
5.19 19.37 49.35 25.22 4.61 18.74 42.66 25.75 4.78 18.76 43.74 2002 27.53 7.03 17.23 48.17 28.28 7.25 17.51 48.46 26.60 6.64 17.02 42.05 27.29 6.91 17.40 45.08 2003 22.21 5.31 15.59 35.21 22.61 5.35 16.19 35.66 21.64 5.18 14.66 33.99 21.98 5.24 15.58 34.69 2004 15.59 1.93 11.41 21.06 16.05 2.02 11.64 22.67 15.05 1.79 11.14 20.61 15.48 1.92 11.23 21.58 2005 12.84 1.44 10.23 18.33 13.28 1.59 10.48 18.59 12.39 1.32 9.88 16.41 12.81 1.47 10.23 17.74 2006 12.90 2.18 9.68 23.45 13.33 2.46 10.06 23.81 12.38 1.96 9.39 21.45 12.81 2.25 9.90 23.81 2007 17.59 5.36 9.99 32.68 18.44 5.76 10.26 37.50 16.75 4.95 9.70 30.44 17.54 5.36 9.89 31.09 2008 32.83 16.41 16.30 80.74 34.57 17.83 17.84 89.53 30.96 14.96 15.82 72.76 32.69 16.38 16.30 80.86 2009 31.75 9.20 19.54 52.65 32.78 9.61 19.67 57.36 30.50 8.63 19.25 49.27 31.48 9.08 19.47 56.65 2010 22.73 5.29 15.44 47.66 23.69 5.82 16.00 48.20 21.69 4.61 15.23 40.30 22.55 5.27 15.45 45.79 2011 24.27 8.17 14.31 46.18 25.40 8.78 14.99 48.00 23.15 7.59 14.27 41.51 24.20 8.14 14.62 48.00 2012 17.93 2.60 13.68 26.35 18.59 2.72 14.08 27.73 17.21 2.37 13.30 25.72 17.80 2.54 13.45 26.66 2013 14.29 1.67 11.52 20.87 14.82 1.88 11.75 21.91 13.80 1.51 11.05 19.04 14.23 1.74 11.30 20.49 2014 14.23 2.65 10.40 29.26 14.95 3.02 10.76 31.06 13.61 2.21 10.28 24.64 14.17 2.62 10.32 25.27 2015 16.71 3.99 11.77 31.91 17.79 5.03 12.22 53.29 15.85 3.65 10.88 29.91 16.67 4.34 11.95 40.74 2016 16.01 4.05 11.32 29.01 16.85 4.40 11.49 32.09 15.16 3.66 10.93 26.67 15.83 3.97 11.27 28.14 2017 11.14 1.34 9.23 16.19 11.72 1.54 9.52 17.28 10.64 1.16 8.56 14.97 11.09 1.36 9.14 16.04 2018 16.63 5.01 9.01 37.32 18.03 6.12 9.31 50.30 15.53 4.25 8.92 29.66 16.64 5.09 9.15 37.32 2019 15.57 2.74 11.55 27.54 16.41 3.06 11.79 28.53 14.76 2.38 11.03 24.05 15.39 2.61 11.54 25.45 2020 29.52 12.45 12.20 82.69 31.46 13.89 12.42 85.47 27.50 10.85 11.75 70.37 29.25 12.34 12.10 82.69 2021 19.83 3.47 15.02 35.16 21.12 4.22 15.54 37.51 18.65 2.93 14.10 29.24 19.66 3.62 15.01 37.21 
2022 25.98 4.30 16.57 37.50 27.25 4.59 17.81 38.94 24.69 3.91 16.34 33.11 25.62 4.22 16.60 36.45 2023 17.12 3.17 11.96 27.77 17.83 3.58 12.46 30.81 16.36 2.89 11.81 24.00 16.87 3.14 12.07 26.52 2024 15.69 3.14 11.53 33.71 16.65 4.73 12.23 65.73 14.92 2.58 10.62 24.02 15.61 3.36 11.86 38.57 2025 19.43 5.78 14.31 60.13 20.72 7.00 14.69 60.13 18.30 4.35 14.12 38.58 19.22 5.49 14.22 52.33

It is interesting to see how much the mean OHLC values vary by year.

And finally, we can run the statistics individually for each month:

```python
# Group by month and calculate mean, std, min, and max for OHLC
vix_stats_by_month = vix.groupby(vix.index.month)[['Open', 'High', 'Low', 'Close']].agg(['mean', 'std', 'min', 'max'])

# Flatten the column MultiIndex
vix_stats_by_month.columns = ['_'.join(col).strip() for col in vix_stats_by_month.columns.values]
vix_stats_by_month.index.name = 'Month'

display(vix_stats_by_month)
```

Gives us:

Month Open_mean Open_std Open_min Open_max High_mean High_std High_min High_max Low_mean Low_std Low_min Low_max Close_mean Close_std Close_min Close_max 1 19.34 7.21 9.01 51.52 20.13 7.58 9.31 57.36 18.60 6.87 8.92 49.27 19.22 7.17 9.15 56.65 2 19.67 7.22 10.19 52.50 20.51 7.65 10.26 53.16 18.90 6.81 9.70 48.97 19.58 7.13 10.02 52.62 3 20.47 9.63 10.59 82.69 21.39 10.49 11.24 85.47 19.54 8.65 10.53 70.37 20.35 9.56 10.74 82.69 4 19.43 7.48 10.39 60.13 20.24 7.93 10.89 60.59 18.65 6.88 10.22 52.76 19.29 7.28 10.36 57.06 5 18.60 6.04 9.75 47.66 19.40 6.43 10.14 48.20 17.89 5.63 9.56 40.30 18.51 5.96 9.77 45.79 6 18.46 5.75 9.79 44.09 19.15 6.02 10.28 44.44 17.73 5.40 9.37 34.97 18.34 5.68 9.75 40.79 7 17.83 5.67 9.18 48.17 18.53 5.90 9.52 48.46 17.21 5.41 8.84 42.05 17.76 5.60 9.36 44.92 8 19.09 6.67 10.04 45.34 20.03 7.38 10.32 65.73 18.35 6.32 9.52 41.77 19.09 6.80 9.93 48.00 9 20.37 8.23 9.59 48.93 21.21 8.55 9.83 49.35 19.62
7.82 9.36 43.74 20.29 8.12 9.51 46.72 10 21.72 10.16 9.23 79.13 22.73 10.97 9.62 89.53 20.82 9.40 9.11 67.80 21.64 10.12 9.19 80.06 11 20.32 9.58 9.31 80.74 21.03 9.96 9.74 81.48 19.53 8.95 8.56 72.76 20.14 9.45 9.14 80.86 12 19.34 8.26 9.36 66.68 20.09 8.53 9.55 68.60 18.63 7.88 8.89 62.31 19.29 8.16 9.31 68.51

Deciles - VIX

Here are the levels for each decile, for the full dataset:

```python
vix_deciles = vix.quantile(np.arange(0, 1.1, 0.1))
display(vix_deciles)
```

Gives us:

| | Close | High | Low | Open |
|------|-------|------|-----|------|
| 0.00 | 9.14 | 9.31 | 8.56 | 9.01 |
| 0.10 | 12.14 | 12.65 | 11.73 | 12.15 |
| 0.20 | 13.30 | 13.89 | 12.89 | 13.34 |
| 0.30 | 14.66 | 15.35 | 14.15 | 14.73 |
| 0.40 | 16.11 | 16.77 | 15.58 | 16.15 |
| 0.50 | 17.61 | 18.33 | 17.02 | 17.66 |
| 0.60 | 19.49 | 20.34 | 18.93 | 19.61 |
| 0.70 | 21.56 | 22.57 | 20.91 | 21.68 |
| 0.80 | 24.25 | 25.23 | 23.40 | 24.33 |
| 0.90 | 28.63 | 29.91 | 27.67 | 28.80 |
| 1.00 | 82.69 | 89.53 | 72.76 | 82.69 |

We can also run the deciles individually for each year (note: the markdown export is messy):

```python
# Group by year for deciles
vix_deciles_by_year = vix.groupby(vix.index.year)[['Open', 'High', 'Low', 'Close']].quantile(np.arange(0, 1.1, 0.1))
display(vix_deciles_by_year)
```

Open High Low Close (1990, 0.0) 14.72 14.72 14.72 14.72 (1990, 0.1) 17.18 17.18 17.18 17.18 (1990, 0.2) 18.47 18.47 18.47 18.47 (1990, 0.30000000000000004) 20.08 20.08 20.08 20.08 (1990, 0.4) 21.15 21.15 21.15 21.15 (1990, 0.5) 22.57 22.57 22.57 22.57 (1990, 0.6000000000000001) 23.76 23.76 23.76 23.76 (1990, 0.7000000000000001) 25.37 25.37 25.37 25.37 (1990, 0.8) 28.05 28.05 28.05 28.05 (1990, 0.9) 29.88 29.88 29.88 29.88 (1990, 1.0) 36.47 36.47 36.47 36.47 (1991, 0.0) 13.95 13.95 13.95 13.95 (1991, 0.1) 15.45 15.45 15.45 15.45 (1991, 0.2) 15.78 15.78 15.78 15.78 (1991, 0.30000000000000004) 16.26 16.26 16.26 16.26 (1991, 0.4) 16.78 16.78 16.78 16.78 (1991, 0.5) 17.44 17.44 17.44 17.44 (1991, 0.6000000000000001) 17.81 17.81 17.81 17.81 (1991, 0.7000000000000001) 18.84 18.84 18.84 18.84 (1991, 0.8) 20.39 20.39 20.39 20.39 (1991,
0.9) 22.25 22.25 22.25 22.25 (1991, 1.0) 36.20 36.20 36.20 36.20 (1992, 0.0) 10.29 11.90 10.29 11.51 (1992, 0.1) 12.30 13.30 12.14 12.62 (1992, 0.2) 12.99 13.93 12.77 13.48 (1992, 0.30000000000000004) 13.87 14.70 13.50 13.94 (1992, 0.4) 14.37 15.28 14.10 14.63 (1992, 0.5) 15.36 15.96 14.87 15.36 (1992, 0.6000000000000001) 15.88 16.69 15.58 16.24 (1992, 0.7000000000000001) 16.43 17.17 16.10 16.66 (1992, 0.8) 17.09 17.75 16.61 17.20 (1992, 0.9) 18.23 18.89 17.55 18.40 (1992, 1.0) 20.67 25.13 19.67 21.02 (1993, 0.0) 9.18 9.55 8.89 9.31 (1993, 0.1) 11.10 11.72 10.71 11.21 (1993, 0.2) 11.61 12.16 11.22 11.54 (1993, 0.30000000000000004) 11.89 12.41 11.52 11.88 (1993, 0.4) 12.23 12.82 11.79 12.16 (1993, 0.5) 12.58 13.18 12.15 12.43 (1993, 0.6000000000000001) 13.04 13.71 12.50 12.90 (1993, 0.7000000000000001) 13.36 14.11 12.92 13.36 (1993, 0.8) 13.77 14.63 13.35 13.85 (1993, 0.9) 14.62 15.07 14.00 14.51 (1993, 1.0) 16.20 18.31 15.77 17.30 (1994, 0.0) 9.86 10.31 9.59 9.94 (1994, 0.1) 11.43 11.89 11.14 11.39 (1994, 0.2) 11.81 12.44 11.50 11.89 (1994, 0.30000000000000004) 12.48 13.02 11.86 12.49 (1994, 0.4) 12.95 13.73 12.62 13.10 (1994, 0.5) 13.54 14.51 13.14 13.86 (1994, 0.6000000000000001) 14.25 15.17 13.84 14.49 (1994, 0.7000000000000001) 15.00 15.63 14.40 14.92 (1994, 0.8) 15.63 16.62 15.30 15.98 (1994, 0.9) 16.43 17.27 16.02 16.60 (1994, 1.0) 23.61 28.30 23.61 23.87 (1995, 0.0) 10.29 10.95 10.06 10.36 (1995, 0.1) 11.12 11.71 10.89 11.25 (1995, 0.2) 11.37 11.96 11.10 11.51 (1995, 0.30000000000000004) 11.64 12.25 11.31 11.74 (1995, 0.4) 11.89 12.54 11.61 12.04 (1995, 0.5) 12.13 12.80 11.81 12.30 (1995, 0.6000000000000001) 12.34 13.04 12.11 12.52 (1995, 0.7000000000000001) 12.72 13.35 12.40 12.81 (1995, 0.8) 13.11 13.80 12.79 13.15 (1995, 0.9) 13.62 14.37 13.27 13.65 (1995, 1.0) 15.79 16.99 14.97 15.74 (1996, 0.0) 11.24 12.29 11.11 12.00 (1996, 0.1) 13.97 14.67 13.64 14.13 (1996, 0.2) 14.80 15.46 14.60 14.97 (1996, 0.30000000000000004) 15.43 16.01 14.99 15.46 (1996, 0.4) 
15.89 16.37 15.51 15.94 (1996, 0.5) 16.29 16.80 15.93 16.24 (1996, 0.6000000000000001) 16.66 17.16 16.25 16.78 (1996, 0.7000000000000001) 16.99 17.67 16.73 17.10 (1996, 0.8) 17.77 18.64 17.31 17.93 (1996, 0.9) 18.69 19.72 18.28 19.03 (1996, 1.0) 23.90 27.05 21.43 21.99 (1997, 0.0) 16.67 18.02 16.36 17.09 (1997, 0.1) 19.01 19.47 18.63 19.02 (1997, 0.2) 19.51 20.04 19.14 19.47 (1997, 0.30000000000000004) 19.87 20.51 19.48 19.90 (1997, 0.4) 20.37 20.96 19.88 20.39 (1997, 0.5) 21.01 21.54 20.49 20.95 (1997, 0.6000000000000001) 21.54 22.24 21.03 21.53 (1997, 0.7000000000000001) 22.85 23.40 22.05 22.75 (1997, 0.8) 24.66 25.17 24.04 24.49 (1997, 0.9) 28.71 28.92 27.29 28.43 (1997, 1.0) 45.69 48.64 36.43 38.20 (1998, 0.0) 16.42 16.50 16.10 16.23 (1998, 0.1) 19.07 19.56 18.76 18.99 (1998, 0.2) 20.09 20.66 19.63 20.18 (1998, 0.30000000000000004) 21.01 21.60 20.39 21.00 (1998, 0.4) 21.99 22.89 21.55 22.06 (1998, 0.5) 23.30 24.01 22.62 23.14 (1998, 0.6000000000000001) 24.91 25.99 24.06 24.94 (1998, 0.7000000000000001) 27.48 28.72 26.80 27.89 (1998, 0.8) 31.12 32.52 29.54 31.12 (1998, 0.9) 37.78 39.42 36.05 36.62 (1998, 1.0) 47.95 49.53 45.58 45.74 (1999, 0.0) 18.05 18.48 17.07 17.42 (1999, 0.1) 20.98 21.64 20.56 21.09 (1999, 0.2) 22.07 22.84 21.55 21.91 (1999, 0.30000000000000004) 22.60 23.35 22.12 22.71 (1999, 0.4) 23.39 24.04 22.78 23.30 (1999, 0.5) 24.12 24.82 23.43 24.10 (1999, 0.6000000000000001) 24.79 25.55 24.21 24.79 (1999, 0.7000000000000001) 25.57 26.49 24.86 25.58 (1999, 0.8) 26.66 27.77 25.94 26.62 (1999, 0.9) 28.57 29.61 27.63 28.35 (1999, 1.0) 32.62 33.66 31.13 32.98 (2000, 0.0) 16.81 17.06 16.28 16.53 (2000, 0.1) 19.03 19.29 18.77 18.95 (2000, 0.2) 20.20 20.52 19.71 20.03 (2000, 0.30000000000000004) 21.29 21.75 20.71 21.24 (2000, 0.4) 22.20 22.98 21.62 22.23 (2000, 0.5) 23.39 24.14 22.73 23.24 (2000, 0.6000000000000001) 24.34 25.02 23.59 24.30 (2000, 0.7000000000000001) 25.33 26.14 24.61 25.08 (2000, 0.8) 26.54 27.41 25.85 26.53 (2000, 0.9) 27.97 28.83 27.05 
27.69 (2000, 1.0) 33.70 34.31 30.56 33.49 (2001, 0.0) 19.21 19.37 18.74 18.76 (2001, 0.1) 21.04 21.55 20.52 20.74 (2001, 0.2) 21.92 22.36 21.43 21.81 (2001, 0.30000000000000004) 22.64 23.13 22.18 22.57 (2001, 0.4) 23.49 24.13 23.00 23.53 (2001, 0.5) 24.57 25.22 24.04 24.26 (2001, 0.6000000000000001) 25.62 26.18 25.02 25.62 (2001, 0.7000000000000001) 27.95 28.72 27.07 27.43 (2001, 0.8) 30.23 31.05 29.04 29.82 (2001, 0.9) 32.94 34.11 32.05 32.27 (2001, 1.0) 48.93 49.35 42.66 43.74 (2002, 0.0) 17.23 17.51 17.02 17.40 (2002, 0.1) 19.31 19.81 19.00 19.30 (2002, 0.2) 20.73 21.07 20.10 20.39 (2002, 0.30000000000000004) 22.05 22.62 21.35 21.92 (2002, 0.4) 23.93 24.61 22.96 23.75 (2002, 0.5) 26.35 27.37 25.74 26.39 (2002, 0.6000000000000001) 28.57 29.59 27.84 28.53 (2002, 0.7000000000000001) 31.17 31.71 29.96 30.81 (2002, 0.8) 34.80 35.60 33.62 34.10 (2002, 0.9) 37.52 38.77 36.18 37.33 (2002, 1.0) 48.17 48.46 42.05 45.08 (2003, 0.0) 15.59 16.19 14.66 15.58 (2003, 0.1) 17.05 17.42 16.51 16.82 (2003, 0.2) 18.00 18.34 17.59 17.76 (2003, 0.30000000000000004) 18.91 19.35 18.38 18.67 (2003, 0.4) 19.52 19.87 19.10 19.40 (2003, 0.5) 20.06 20.46 19.61 19.85 (2003, 0.6000000000000001) 20.97 21.28 20.44 20.80 (2003, 0.7000000000000001) 22.43 22.85 21.79 22.25 (2003, 0.8) 28.30 28.68 27.52 27.92 (2003, 0.9) 31.45 31.97 30.83 31.25 (2003, 1.0) 35.21 35.66 33.99 34.69 (2004, 0.0) 11.41 11.64 11.14 11.23 (2004, 0.1) 13.07 13.39 12.77 13.09 (2004, 0.2) 13.98 14.41 13.62 13.92 (2004, 0.30000000000000004) 14.73 15.11 14.17 14.55 (2004, 0.4) 15.22 15.57 14.58 14.97 (2004, 0.5) 15.45 15.97 14.98 15.32 (2004, 0.6000000000000001) 15.88 16.48 15.32 15.76 (2004, 0.7000000000000001) 16.37 16.81 15.79 16.27 (2004, 0.8) 16.86 17.53 16.22 16.77 (2004, 0.9) 18.45 18.86 17.61 18.13 (2004, 1.0) 21.06 22.67 20.61 21.58 (2005, 0.0) 10.23 10.48 9.88 10.23 (2005, 0.1) 11.09 11.38 10.81 11.10 (2005, 0.2) 11.58 11.92 11.22 11.52 (2005, 0.30000000000000004) 11.97 12.30 11.52 11.91 (2005, 0.4) 12.23 12.61 11.91 
12.25 (2005, 0.5) 12.68 12.99 12.23 12.52 (2005, 0.6000000000000001) 13.12 13.47 12.57 13.09 (2005, 0.7000000000000001) 13.57 13.92 13.04 13.42 (2005, 0.8) 14.03 14.44 13.46 14.04 (2005, 0.9) 14.85 15.66 14.27 14.84 (2005, 1.0) 18.33 18.59 16.41 17.74 (2006, 0.0) 9.68 10.06 9.39 9.90 (2006, 0.1) 11.01 11.26 10.59 10.79 (2006, 0.2) 11.30 11.57 10.92 11.19 (2006, 0.30000000000000004) 11.51 11.89 11.19 11.52 (2006, 0.4) 11.92 12.17 11.51 11.75 (2006, 0.5) 12.22 12.47 11.78 12.00 (2006, 0.6000000000000001) 12.52 12.91 12.11 12.39 (2006, 0.7000000000000001) 13.09 13.61 12.71 13.04 (2006, 0.8) 14.44 14.93 13.86 14.33 (2006, 0.9) 16.16 17.08 15.27 16.21 (2006, 1.0) 23.45 23.81 21.45 23.81 (2007, 0.0) 9.99 10.26 9.70 9.89 (2007, 0.1) 11.09 11.48 10.52 11.10 (2007, 0.2) 12.77 13.27 12.48 12.83 (2007, 0.30000000000000004) 13.48 13.97 12.92 13.42 (2007, 0.4) 14.91 15.51 14.24 14.72 (2007, 0.5) 16.39 17.28 15.31 16.43 (2007, 0.6000000000000001) 18.65 19.42 17.49 18.53 (2007, 0.7000000000000001) 20.47 22.09 19.68 20.74 (2007, 0.8) 23.35 24.17 22.00 22.96 (2007, 0.9) 25.79 26.72 23.83 25.25 (2007, 1.0) 32.68 37.50 30.44 31.09 (2008, 0.0) 16.30 17.84 15.82 16.30 (2008, 0.1) 19.81 20.55 19.05 19.66 (2008, 0.2) 21.18 21.85 20.36 21.04 (2008, 0.30000000000000004) 22.65 23.47 21.80 22.64 (2008, 0.4) 23.76 24.42 22.70 23.52 (2008, 0.5) 25.38 25.96 23.90 25.10 (2008, 0.6000000000000001) 26.62 28.37 25.46 26.73 (2008, 0.7000000000000001) 32.30 33.91 28.97 31.38 (2008, 0.8) 49.00 53.08 44.87 49.00 (2008, 0.9) 62.52 65.78 58.33 60.86 (2008, 1.0) 80.74 89.53 72.76 80.86 (2009, 0.0) 19.54 19.67 19.25 19.47 (2009, 0.1) 21.99 22.76 21.50 22.11 (2009, 0.2) 23.88 24.35 22.99 23.57 (2009, 0.30000000000000004) 24.91 25.72 24.26 24.78 (2009, 0.4) 25.92 26.64 24.95 25.64 (2009, 0.5) 28.95 29.56 27.27 28.57 (2009, 0.6000000000000001) 31.12 32.64 30.04 31.11 (2009, 0.7000000000000001) 37.20 38.03 36.07 36.73 (2009, 0.8) 42.21 43.67 40.75 42.25 (2009, 0.9) 45.71 47.33 43.82 45.43 (2009, 1.0) 52.65 
57.36 49.27 56.65 (2010, 0.0) 15.44 16.00 15.23 15.45 (2010, 0.1) 17.22 17.85 16.79 17.29 (2010, 0.2) 18.04 18.55 17.56 17.91 (2010, 0.30000000000000004) 19.26 19.70 18.44 18.86 (2010, 0.4) 20.47 21.42 19.60 20.21 (2010, 0.5) 21.94 22.60 21.24 21.72 (2010, 0.6000000000000001) 22.88 23.75 22.17 22.69 (2010, 0.7000000000000001) 24.45 25.61 23.31 24.40 (2010, 0.8) 26.32 27.24 24.98 25.99 (2010, 0.9) 29.52 31.14 28.06 29.63 (2010, 1.0) 47.66 48.20 40.30 45.79 (2011, 0.0) 14.31 14.99 14.27 14.62 (2011, 0.1) 16.29 16.74 15.84 16.06 (2011, 0.2) 17.00 17.73 16.34 17.07 (2011, 0.30000000000000004) 17.91 18.57 17.14 17.75 (2011, 0.4) 18.95 19.70 17.94 18.94 (2011, 0.5) 20.66 21.62 19.80 20.72 (2011, 0.6000000000000001) 24.06 25.15 22.96 24.31 (2011, 0.7000000000000001) 29.62 31.57 28.72 29.93 (2011, 0.8) 32.96 34.19 31.47 32.64 (2011, 0.9) 36.53 38.50 34.27 36.20 (2011, 1.0) 46.18 48.00 41.51 48.00 (2012, 0.0) 13.68 14.08 13.30 13.45 (2012, 0.1) 14.96 15.50 14.49 15.04 (2012, 0.2) 15.60 16.19 15.13 15.56 (2012, 0.30000000000000004) 16.12 16.68 15.70 16.26 (2012, 0.4) 16.96 17.44 16.30 16.72 (2012, 0.5) 17.65 18.12 16.96 17.52 (2012, 0.6000000000000001) 18.18 18.81 17.60 18.07 (2012, 0.7000000000000001) 18.89 19.62 17.98 18.63 (2012, 0.8) 19.91 20.70 18.96 19.55 (2012, 0.9) 21.67 22.69 20.76 21.49 (2012, 1.0) 26.35 27.73 25.72 26.66 (2013, 0.0) 11.52 11.75 11.05 11.30 (2013, 0.1) 12.66 12.99 12.32 12.53 (2013, 0.2) 12.93 13.33 12.62 12.82 (2013, 0.30000000000000004) 13.24 13.63 12.85 13.15 (2013, 0.4) 13.47 13.89 13.10 13.48 (2013, 0.5) 13.72 14.22 13.38 13.74 (2013, 0.6000000000000001) 14.12 14.62 13.74 14.07 (2013, 0.7000000000000001) 14.73 15.23 14.14 14.64 (2013, 0.8) 15.89 16.44 14.98 15.61 (2013, 0.9) 16.79 17.48 16.15 16.64 (2013, 1.0) 20.87 21.91 19.04 20.49 (2014, 0.0) 10.40 10.76 10.28 10.32 (2014, 0.1) 11.69 12.06 11.41 11.65 (2014, 0.2) 12.24 12.62 11.88 12.13 (2014, 0.30000000000000004) 12.66 13.23 12.24 12.64 (2014, 0.4) 13.16 13.78 12.83 13.13 (2014, 0.5) 13.80 
14.29 13.28 13.67 (2014, 0.6000000000000001) 14.22 14.83 13.79 14.14 (2014, 0.7000000000000001) 14.75 15.58 14.14 14.71 (2014, 0.8) 15.68 16.69 15.08 15.64 (2014, 0.9) 16.94 18.06 16.00 17.23 (2014, 1.0) 29.26 31.06 24.64 25.27 (2015, 0.0) 11.77 12.22 10.88 11.95 (2015, 0.1) 12.98 13.40 12.49 12.75 (2015, 0.2) 13.50 14.08 12.89 13.39 (2015, 0.30000000000000004) 14.13 14.75 13.41 13.97 (2015, 0.4) 14.93 15.63 14.15 14.67 (2015, 0.5) 15.55 16.29 14.86 15.32 (2015, 0.6000000000000001) 16.42 17.19 15.56 16.09 (2015, 0.7000000000000001) 17.56 18.67 16.60 17.32 (2015, 0.8) 19.18 20.41 18.11 19.39 (2015, 0.9) 22.78 24.26 21.08 22.44 (2015, 1.0) 31.91 53.29 29.91 40.74 (2016, 0.0) 11.32 11.49 10.93 11.27 (2016, 0.1) 12.23 12.78 11.79 12.15 (2016, 0.2) 12.86 13.50 12.30 12.83 (2016, 0.30000000000000004) 13.39 14.06 12.87 13.34 (2016, 0.4) 13.86 14.59 13.29 13.72 (2016, 0.5) 14.57 15.39 13.75 14.31 (2016, 0.6000000000000001) 15.51 16.30 14.69 15.23 (2016, 0.7000000000000001) 16.49 17.48 15.66 16.32 (2016, 0.8) 19.52 20.31 18.23 18.75 (2016, 0.9) 22.27 23.79 21.22 22.29 (2016, 1.0) 29.01 32.09 26.67 28.14 (2017, 0.0) 9.23 9.52 8.56 9.14 (2017, 0.1) 9.75 10.14 9.43 9.73 (2017, 0.2) 9.98 10.41 9.68 9.97 (2017, 0.30000000000000004) 10.22 10.74 9.87 10.20 (2017, 0.4) 10.54 11.04 10.12 10.49 (2017, 0.5) 10.89 11.40 10.34 10.85 (2017, 0.6000000000000001) 11.36 11.84 10.80 11.23 (2017, 0.7000000000000001) 11.61 12.18 11.12 11.49 (2017, 0.8) 11.96 12.61 11.45 11.81 (2017, 0.9) 12.60 14.05 11.79 12.54 (2017, 1.0) 16.19 17.28 14.97 16.04 (2018, 0.0) 9.01 9.31 8.92 9.15 (2018, 0.1) 11.91 12.41 11.31 11.80 (2018, 0.2) 12.46 13.07 11.93 12.37 (2018, 0.30000000000000004) 13.03 13.74 12.42 12.89 (2018, 0.4) 13.93 14.63 13.09 13.63 (2018, 0.5) 15.37 16.72 14.57 15.49 (2018, 0.6000000000000001) 16.63 18.19 15.79 16.85 (2018, 0.7000000000000001) 18.52 20.60 17.68 18.90 (2018, 0.8) 20.76 22.31 19.57 20.60 (2018, 0.9) 23.03 25.88 20.94 23.34 (2018, 1.0) 37.32 50.30 29.66 37.32 (2019, 0.0) 11.55 
11.79 11.03 11.54 (2019, 0.1) 12.65 13.10 12.24 12.61 (2019, 0.2) 13.11 13.69 12.56 13.02 (2019, 0.30000000000000004) 13.73 14.30 13.14 13.56 (2019, 0.4) 14.22 14.98 13.60 14.23 (2019, 0.5) 14.92 15.86 14.39 14.87 (2019, 0.6000000000000001) 15.66 16.50 15.02 15.67 (2019, 0.7000000000000001) 16.32 17.63 15.56 16.23 (2019, 0.8) 17.86 19.13 16.94 17.56 (2019, 0.9) 19.45 21.07 18.28 19.13 (2019, 1.0) 27.54 28.53 24.05 25.45 (2020, 0.0) 12.20 12.42 11.75 12.10 (2020, 0.1) 15.72 16.44 14.95 15.47 (2020, 0.2) 22.11 22.90 21.16 21.97 (2020, 0.30000000000000004) 23.44 24.50 22.25 23.21 (2020, 0.4) 25.30 26.57 24.14 25.14 (2020, 0.5) 27.10 28.44 25.53 26.70 (2020, 0.6000000000000001) 28.61 30.17 27.14 28.02 (2020, 0.7000000000000001) 30.97 33.10 28.37 31.29 (2020, 0.8) 34.96 37.95 31.91 34.60 (2020, 0.9) 44.15 45.73 41.10 41.97 (2020, 1.0) 82.69 85.47 70.37 82.69 (2021, 0.0) 15.02 15.54 14.10 15.01 (2021, 0.1) 16.27 16.98 15.68 16.18 (2021, 0.2) 16.96 17.65 16.14 16.67 (2021, 0.30000000000000004) 17.43 18.18 16.71 17.28 (2021, 0.4) 18.00 19.05 17.21 17.91 (2021, 0.5) 18.74 19.92 17.74 18.69 (2021, 0.6000000000000001) 19.85 21.34 18.73 19.58 (2021, 0.7000000000000001) 21.54 22.66 19.86 21.01 (2021, 0.8) 22.56 23.89 21.36 21.97 (2021, 0.9) 24.22 26.56 22.55 24.07 (2021, 1.0) 35.16 37.51 29.24 37.21 (2022, 0.0) 16.57 17.81 16.34 16.60 (2022, 0.1) 20.61 21.32 19.78 20.31 (2022, 0.2) 22.06 22.86 21.21 21.67 (2022, 0.30000000000000004) 23.17 24.03 22.06 22.79 (2022, 0.4) 24.27 25.33 23.02 23.93 (2022, 0.5) 25.54 26.62 24.38 25.47 (2022, 0.6000000000000001) 26.88 28.07 25.61 26.30 (2022, 0.7000000000000001) 28.42 29.82 26.93 28.11 (2022, 0.8) 30.27 32.01 28.56 29.92 (2022, 0.9) 31.90 33.87 30.06 31.62 (2022, 1.0) 37.50 38.94 33.11 36.45 (2023, 0.0) 11.96 12.46 11.81 12.07 (2023, 0.1) 13.28 13.75 12.87 13.08 (2023, 0.2) 14.01 14.30 13.50 13.73 (2023, 0.30000000000000004) 14.49 14.94 13.94 14.30 (2023, 0.4) 15.77 16.60 14.93 15.66 (2023, 0.5) 16.96 17.79 16.35 16.94 (2023, 
0.6000000000000001) 17.87 18.79 17.07 17.70 (2023, 0.7000000000000001) 19.06 19.81 18.18 18.71 (2023, 0.8) 19.74 20.60 18.89 19.59 (2023, 0.9) 21.56 21.98 20.23 20.96 (2023, 1.0) 27.77 30.81 24.00 26.52 (2024, 0.0) 11.53 12.23 10.62 11.86 (2024, 0.1) 12.79 13.13 12.36 12.63 (2024, 0.2) 13.20 13.56 12.84 13.06 (2024, 0.30000000000000004) 13.68 14.10 13.26 13.45 (2024, 0.4) 14.17 14.62 13.61 14.03 (2024, 0.5) 14.90 15.41 14.13 14.63 (2024, 0.6000000000000001) 15.47 16.47 14.86 15.38 (2024, 0.7000000000000001) 16.33 17.47 15.56 16.31 (2024, 0.8) 17.61 19.30 16.68 18.02 (2024, 0.9) 20.44 21.14 18.91 19.89 (2024, 1.0) 33.71 65.73 24.02 38.57 (2025, 0.0) 14.31 14.69 14.12 14.22 (2025, 0.1) 15.27 15.95 14.79 15.15 (2025, 0.2) 15.95 16.56 15.34 15.81 (2025, 0.30000000000000004) 16.42 17.18 15.94 16.38 (2025, 0.4) 16.94 17.58 16.27 16.74 (2025, 0.5) 17.67 18.76 16.96 17.41 (2025, 0.6000000000000001) 18.40 19.50 17.58 18.22 (2025, 0.7000000000000001) 19.59 21.09 18.58 19.35 (2025, 0.8) 21.23 22.94 19.68 21.61 (2025, 0.9) 24.77 26.59 23.77 24.71 (2025, 1.0) 60.13 60.13 38.58 52.33 And then comparing last year to the current year:\n1 2 3 4 5 6 7 8 9 10 current_year = datetime.now().year last_year = current_year - 1 print(f\u0026#34;Last year: {last_year}\u0026#34;) vix_deciles_last_year = vix_deciles_by_year.loc[last_year] display(vix_deciles_last_year) print(f\u0026#34;Current year: {current_year}\u0026#34;) vix_deciles_current_year = vix_deciles_by_year.loc[current_year] display(vix_deciles_current_year) Year: 2024\nOpen High Low Close 0.00 11.53 12.23 10.62 11.86 0.10 12.79 13.13 12.36 12.63 0.20 13.20 13.56 12.84 13.06 0.30 13.68 14.10 13.26 13.45 0.40 14.17 14.62 13.61 14.03 0.50 14.90 15.41 14.13 14.63 0.60 15.47 16.47 14.86 15.38 0.70 16.33 17.47 15.56 16.31 0.80 17.61 19.30 16.68 18.02 0.90 20.44 21.14 18.91 19.89 1.00 33.71 65.73 24.02 38.57 Year: 2025\nOpen High Low Close 0.00 14.31 14.69 14.12 14.22 0.10 15.27 15.95 14.79 15.15 0.20 15.95 16.56 15.34 15.81 0.30 
16.42 17.18 15.94 16.38 0.40 16.94 17.58 16.27 16.74 0.50 17.67 18.76 16.96 17.41 0.60 18.40 19.50 17.58 18.22 0.70 19.59 21.09 18.58 19.35 0.80 21.23 22.94 19.68 21.61 0.90 24.77 26.59 23.77 24.71 1.00 60.13 60.13 38.58 52.33 Plots - VIX Histogram Distribution - VIX A quick histogram gives us the distribution for the entire dataset, along with levels for the mean and for minus 1 through plus 4 standard deviations from the mean:\nHistorical Data - VIX Here are two plots for the dataset. The first covers 1990 - 2009, and the second 2010 - Present. This is the daily high level:\nFrom these plots, we can see the following:\nThe VIX has jumped above 50 only a handful of times (the GFC, COVID, and most recently August 2024) The highest levels (\u0026gt; 80) occurred only during the GFC \u0026amp; COVID Interestingly, the VIX never rose above 50 during the dot-com bubble Stats By Year - VIX Here\u0026rsquo;s the plot for the mean OHLC values for the VIX by year:\nStats By Month - VIX Here\u0026rsquo;s the plot for the mean OHLC values for the VIX by month:\nData Overview (VVIX) Before moving on to generating a signal, let\u0026rsquo;s run the above data overview code again, but this time for the CBOE VVIX. From the CBOE VVIX website:\n\u0026ldquo;Volatility is often called a new asset class, and every asset class deserves its own volatility index. The Cboe VVIX IndexSM represents the expected volatility of the VIX®. 
VVIX derives the expected 30-day volatility of VIX by applying the VIX algorithm to VIX options.\u0026rdquo;\nLooking at the statistics of the VVIX should give us an idea of the volatility of the VIX.\nAcquire CBOE VVIX Data First, let\u0026rsquo;s get the data:\n1 2 3 4 5 6 7 8 9 yf_pull_data( base_directory=DATA_DIR, ticker=\u0026#34;^VVIX\u0026#34;, source=\u0026#34;Yahoo_Finance\u0026#34;, asset_class=\u0026#34;Indices\u0026#34;, excel_export=True, pickle_export=True, output_confirmation=True, ) Load Data - VVIX Now that we have the data, let\u0026rsquo;s load it up and take a look:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 # Set decimal places pandas_set_decimal_places(2) # VVIX vvix = load_data( base_directory=DATA_DIR, ticker=\u0026#34;^VVIX\u0026#34;, source=\u0026#34;Yahoo_Finance\u0026#34;, asset_class=\u0026#34;Indices\u0026#34;, timeframe=\u0026#34;Daily\u0026#34;, ) # Set \u0026#39;Date\u0026#39; column as datetime vvix[\u0026#39;Date\u0026#39;] = pd.to_datetime(vvix[\u0026#39;Date\u0026#39;]) # Drop \u0026#39;Volume\u0026#39; vvix.drop(columns = {\u0026#39;Volume\u0026#39;}, inplace = True) # Set Date as index vvix.set_index(\u0026#39;Date\u0026#39;, inplace = True) # Check to see if there are any NaN values vvix[vvix[\u0026#39;High\u0026#39;].isna()] # Forward fill to clean up missing data vvix[\u0026#39;High\u0026#39;] = vvix[\u0026#39;High\u0026#39;].ffill() DataFrame Info - VVIX Now, running:\n1 df_info(vvix) Gives us the following:\n1 2 3 4 5 6 7 8 9 10 11 12 13 The columns, shape, and data types are: \u0026lt;class \u0026#39;pandas.core.frame.DataFrame\u0026#39;\u0026gt; DatetimeIndex: 9037 entries, 1990-01-02 to 2025-11-17 Data columns (total 4 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Close 9037 non-null float64 1 High 9037 non-null float64 2 Low 9037 non-null float64 3 Open 9037 non-null float64 dtypes: float64(4) memory usage: 353.0 KB The first 5 rows are:\nDate Close High Low Open 
1990-01-02 00:00:00 17.24 17.24 17.24 17.24 1990-01-03 00:00:00 18.19 18.19 18.19 18.19 1990-01-04 00:00:00 19.22 19.22 19.22 19.22 1990-01-05 00:00:00 20.11 20.11 20.11 20.11 1990-01-08 00:00:00 20.26 20.26 20.26 20.26 The last 5 rows are:\nDate Close High Low Open 2025-11-11 00:00:00 17.28 18.01 17.25 17.90 2025-11-12 00:00:00 17.51 18.06 17.10 17.21 2025-11-13 00:00:00 20.00 21.31 17.51 17.61 2025-11-14 00:00:00 19.83 23.03 19.56 21.33 2025-11-17 00:00:00 22.38 23.44 19.54 19.58 Statistics - VVIX Here are the statistics for the VVIX, generated in the same manner as above for the VIX:\n1 2 3 4 5 6 7 8 9 10 vvix_stats = vvix.describe() num_std = [-1, 0, 1, 2, 3, 4, 5] for num in num_std: vvix_stats.loc[f\u0026#34;mean + {num} std\u0026#34;] = { \u0026#39;Open\u0026#39;: vvix_stats.loc[\u0026#39;mean\u0026#39;][\u0026#39;Open\u0026#39;] + num * vvix_stats.loc[\u0026#39;std\u0026#39;][\u0026#39;Open\u0026#39;], \u0026#39;High\u0026#39;: vvix_stats.loc[\u0026#39;mean\u0026#39;][\u0026#39;High\u0026#39;] + num * vvix_stats.loc[\u0026#39;std\u0026#39;][\u0026#39;High\u0026#39;], \u0026#39;Low\u0026#39;: vvix_stats.loc[\u0026#39;mean\u0026#39;][\u0026#39;Low\u0026#39;] + num * vvix_stats.loc[\u0026#39;std\u0026#39;][\u0026#39;Low\u0026#39;], \u0026#39;Close\u0026#39;: vvix_stats.loc[\u0026#39;mean\u0026#39;][\u0026#39;Close\u0026#39;] + num * vvix_stats.loc[\u0026#39;std\u0026#39;][\u0026#39;Close\u0026#39;], } display(vvix_stats) Gives us:\nClose High Low Open count 4741.00 4741.00 4741.00 4741.00 mean 93.63 95.71 92.05 93.87 std 16.30 17.93 14.96 16.36 min 59.74 59.74 59.31 59.31 25% 82.52 83.72 81.64 82.74 50% 90.84 92.58 89.64 91.20 75% 102.21 105.07 99.88 102.58 max 207.59 212.22 187.27 212.22 mean + -1 std 77.32 77.78 77.09 77.52 mean + 0 std 93.63 95.71 92.05 93.87 mean + 1 std 109.93 113.64 107.00 110.23 mean + 2 std 126.23 131.57 121.96 126.59 mean + 3 std 142.54 149.50 136.92 142.94 mean + 4 std 158.84 167.43 151.88 159.30 mean + 5 std 175.15 185.35 166.83 
175.66 We can also run the statistics individually for each year:\n1 2 3 4 5 6 7 8 # Group by year and calculate mean, std, min, and max for OHLC vvix_stats_by_year = vvix.groupby(vvix.index.year)[[\u0026#34;Open\u0026#34;, \u0026#34;High\u0026#34;, \u0026#34;Low\u0026#34;, \u0026#34;Close\u0026#34;]].agg([\u0026#34;mean\u0026#34;, \u0026#34;std\u0026#34;, \u0026#34;min\u0026#34;, \u0026#34;max\u0026#34;]) # Flatten the column MultiIndex vvix_stats_by_year.columns = [\u0026#39;_\u0026#39;.join(col).strip() for col in vvix_stats_by_year.columns.values] vvix_stats_by_year.index.name = \u0026#34;Year\u0026#34; display(vvix_stats_by_year) Gives us:\nYear Open_mean Open_std Open_min Open_max High_mean High_std High_min High_max Low_mean Low_std Low_min Low_max Close_mean Close_std Close_min Close_max 2007 87.68 13.31 63.52 142.99 87.68 13.31 63.52 142.99 87.68 13.31 63.52 142.99 87.68 13.31 63.52 142.99 2008 81.85 15.60 59.74 134.87 81.85 15.60 59.74 134.87 81.85 15.60 59.74 134.87 81.85 15.60 59.74 134.87 2009 79.78 8.63 64.95 104.02 79.78 8.63 64.95 104.02 79.78 8.63 64.95 104.02 79.78 8.63 64.95 104.02 2010 88.36 13.07 64.87 145.12 88.36 13.07 64.87 145.12 88.36 13.07 64.87 145.12 88.36 13.07 64.87 145.12 2011 92.94 10.21 75.94 134.63 92.94 10.21 75.94 134.63 92.94 10.21 75.94 134.63 92.94 10.21 75.94 134.63 2012 94.84 8.38 78.42 117.44 94.84 8.38 78.42 117.44 94.84 8.38 78.42 117.44 94.84 8.38 78.42 117.44 2013 80.52 8.97 62.71 111.43 80.52 8.97 62.71 111.43 80.52 8.97 62.71 111.43 80.52 8.97 62.71 111.43 2014 83.01 14.33 61.76 138.60 83.01 14.33 61.76 138.60 83.01 14.33 61.76 138.60 83.01 14.33 61.76 138.60 2015 95.44 15.59 73.07 212.22 98.47 16.39 76.41 212.22 92.15 13.35 72.20 148.68 94.82 14.75 73.18 168.75 2016 93.36 10.02 77.96 131.95 95.82 10.86 78.86 132.42 90.54 8.99 76.17 115.15 92.80 10.07 76.17 125.13 2017 90.50 8.65 75.09 134.98 92.94 9.64 77.34 135.32 87.85 7.78 71.75 117.29 90.01 8.80 75.64 135.32 2018 102.60 13.22 83.70 176.72 106.27 16.26 85.00 203.73 99.17 11.31 82.60 165.35 102.26 14.04 83.21 180.61 2019 
91.28 8.43 75.58 112.75 93.61 8.98 75.95 117.63 88.90 7.86 74.36 111.48 91.03 8.36 74.98 114.40 2020 118.64 19.32 88.39 203.03 121.91 20.88 88.54 209.76 115.05 17.37 85.31 187.27 118.36 19.39 86.87 207.59 2021 115.51 9.37 96.09 151.35 119.29 11.70 98.36 168.78 111.99 8.14 95.92 144.19 115.32 10.20 97.09 157.69 2022 102.58 18.01 76.48 161.09 105.32 19.16 77.93 172.82 99.17 16.81 76.13 153.26 101.81 17.81 77.05 154.38 2023 90.95 8.64 74.43 127.73 93.72 9.98 75.31 137.65 88.01 7.37 72.27 119.64 90.34 8.38 73.88 124.75 2024 92.88 15.06 59.31 169.68 97.32 18.33 74.79 192.49 89.51 13.16 59.31 137.05 92.81 15.60 73.26 173.32 2025 102.53 12.92 83.19 186.33 106.86 15.51 85.82 189.03 98.92 10.03 81.73 146.51 101.96 12.31 81.89 170.92 And finally, we can run the statistics individually for each month:\n1 2 3 4 5 6 7 8 # Group by month and calculate mean, std, min, and max for OHLC vvix_stats_by_month = vvix.groupby(vvix.index.month)[[\u0026#34;Open\u0026#34;, \u0026#34;High\u0026#34;, \u0026#34;Low\u0026#34;, \u0026#34;Close\u0026#34;]].agg([\u0026#34;mean\u0026#34;, \u0026#34;std\u0026#34;, \u0026#34;min\u0026#34;, \u0026#34;max\u0026#34;]) # Flatten the column MultiIndex vvix_stats_by_month.columns = [\u0026#39;_\u0026#39;.join(col).strip() for col in vvix_stats_by_month.columns.values] vvix_stats_by_month.index.name = \u0026#34;Month\u0026#34; display(vvix_stats_by_month) Gives us:\nMonth Open_mean Open_std Open_min Open_max High_mean High_std High_min High_max Low_mean Low_std Low_min Low_max Close_mean Close_std Close_min Close_max 1 92.46 15.63 64.87 161.09 94.37 17.63 64.87 172.82 90.69 14.23 64.87 153.26 92.23 15.78 64.87 157.69 2 93.49 18.24 65.47 176.72 95.39 20.70 65.47 203.73 91.39 16.43 65.47 165.35 93.13 18.58 65.47 180.61 3 95.30 21.66 66.97 203.03 97.38 23.56 66.97 209.76 92.94 19.51 66.97 187.27 94.89 21.59 66.97 207.59 4 92.18 19.03 59.74 186.33 94.01 20.57 59.74 189.03 90.30 17.21 59.74 152.01 91.88 18.60 59.74 170.92 5 92.25 16.93 61.76 145.18 93.95 17.99 61.76 151.50 90.54 16.14 61.76 145.12 91.79 16.79 61.76 146.28 
6 93.16 14.86 63.52 155.48 94.76 16.11 63.52 172.21 91.49 13.79 63.52 140.15 92.98 14.83 63.52 151.60 7 90.10 12.82 67.21 138.42 91.63 13.88 67.21 149.60 88.60 11.94 67.21 133.82 89.98 12.78 67.21 139.54 8 96.84 16.53 68.05 212.22 98.99 18.33 68.05 212.22 94.67 14.50 68.05 148.68 96.61 16.24 68.05 173.32 9 94.91 13.70 67.94 135.17 96.84 15.36 67.94 147.14 93.04 12.20 67.94 128.46 94.58 13.44 67.94 138.93 10 98.05 13.86 64.97 149.60 99.88 15.05 64.97 154.99 96.36 13.11 64.97 144.55 97.87 14.02 64.97 152.01 11 93.89 14.14 63.77 142.68 95.48 15.36 63.77 161.76 92.28 13.32 63.77 140.44 93.62 14.20 63.77 149.74 12 93.35 15.03 59.31 151.35 95.33 16.63 62.71 168.37 91.78 13.70 59.31 144.19 93.46 15.07 62.71 156.10 Deciles - VVIX Here are the levels for each decile, for the full dataset:\n1 2 vvix_deciles = vvix.quantile(np.arange(0, 1.1, 0.1)) display(vvix_deciles) Gives us:\nClose High Low Open 0.00 59.74 59.74 59.31 59.31 0.10 76.01 76.25 75.56 76.02 0.20 80.76 81.58 79.99 80.91 0.30 84.10 85.45 83.20 84.43 0.40 87.46 88.88 86.29 87.74 0.50 90.84 92.58 89.64 91.20 0.60 94.44 96.60 93.26 94.74 0.70 99.17 101.68 97.47 99.47 0.80 105.89 109.38 103.73 106.45 0.90 115.10 118.67 112.33 115.27 1.00 207.59 212.22 187.27 212.22 Plots - VVIX Histogram Distribution - VVIX A quick histogram gives us the distribution for the entire dataset, along with levels for the mean and for minus 1 through plus 4 standard deviations from the mean:\nHistorical Data - VVIX Here are two plots for the dataset. The first covers 2007 - 2016, and the second 2017 - Present. 
This is the daily high level:\nStats By Year - VVIX Here\u0026rsquo;s the plot for the mean OHLC values for the VVIX by year:\nStats By Month - VVIX Here\u0026rsquo;s the plot for the mean OHLC values for the VVIX by month:\nReferences https://www.cboe.com/tradable_products/vix/ https://github.com/ranaroussi/yfinance Code The Jupyter notebook with the functions and all other code is available here. The HTML export of the Jupyter notebook is available here. The PDF export of the Jupyter notebook is available here.\n","date":"2025-03-01T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2025/03/01/investigating-a-vix-trading-signal-part-1-vix-and-vvix/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2025/03/01/investigating-a-vix-trading-signal-part-1-vix-and-vvix/","title":"Investigating A VIX Trading Signal, Part 1: VIX And VVIX"},{"content":"Introduction This post provides the code for all of the Python functions that I use in my analysis. The goal is that, when writing another post, I can simply link to the functions below rather than repeating the function code in each post.\nFunction Index bb_clean_data: Takes an Excel export from Bloomberg, removes the miscellaneous headings/rows, and returns a DataFrame. build_index: Reads the index_temp.md markdown file, inserts the markdown dependencies where indicated, and then saves the file as index.md. calc_vix_trade_pnl: Calculates the profit/loss from VIX options trades. coinbase_fetch_available_products: Fetch available products from Coinbase Exchange API. coinbase_fetch_full_history: Fetch full historical data for a given product from Coinbase Exchange API. coinbase_fetch_historical_candles: Fetch historical candle data for a given product from Coinbase Exchange API. coinbase_pull_data: Update existing record or pull full historical data for a given product from Coinbase Exchange API. 
df_info: A simple function to display the information about a DataFrame and the first five rows and last five rows. df_info_markdown: Similar to the df_info function above, except that it converts the output to markdown. export_track_md_deps: Exports various text outputs to markdown files, which are included in the index.md file created when building the site with Hugo. load_data: Load data from a CSV, Excel, or Pickle file into a pandas DataFrame. pandas_set_decimal_places: Set the number of decimal places displayed for floating-point numbers in pandas. plot_timeseries: Plot the price data from a DataFrame for a specified date range and columns. plot_stats: Generate a scatter plot for the mean OHLC prices. plot_vix_with_trades: Plot the VIX daily high and low prices, along with the VIX spikes and trades. polygon_fetch_full_history: Fetch full historical data for a given product from Polygon API. polygon_pull_data: Read existing data file, download price data from Polygon, and export data. strategy_harry_brown_perm_port: Execute the strategy for the Harry Browne permanent portfolio. summary_stats: Generate summary statistics for a series of returns. yf_pull_data: Download daily price data from Yahoo Finance and export it. Python Functions bb_clean_data 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 101 102 import os import pandas as pd from IPython.display import display def bb_clean_data( base_directory: str, fund_ticker_name: str, source: str, asset_class: str, excel_export: bool, pickle_export: bool, output_confirmation: bool, ) -\u0026gt; pd.DataFrame: \u0026#34;\u0026#34;\u0026#34; This function takes an Excel export from Bloomberg and removes all excess data, leaving only the date and close columns. 
Parameters: ----------- base_directory : str Root path to store downloaded data. fund_ticker_name : str Ticker of the fund whose data will be cleaned. source : str Name of the data source (e.g., \u0026#39;Bloomberg\u0026#39;). asset_class : str Asset class name (e.g., \u0026#39;Equities\u0026#39;). excel_export : bool If True, export data to Excel format. pickle_export : bool If True, export data to Pickle format. output_confirmation : bool If True, print confirmation message. Returns: -------- df : pd.DataFrame DataFrame containing the cleaned price data. \u0026#34;\u0026#34;\u0026#34; # Set location from where to read existing excel file location = f\u0026#34;{base_directory}/{source}/{asset_class}/Daily/{fund_ticker_name}.xlsx\u0026#34; # Read data from excel try: df = pd.read_excel(location, sheet_name =\u0026#34;Worksheet\u0026#34;, engine=\u0026#34;calamine\u0026#34;) except FileNotFoundError: raise FileNotFoundError(f\u0026#34;File not found...please download the data for {fund_ticker_name}\u0026#34;) # Set the column headings from row 5 (which is physically row 6) df.columns = df.iloc[5] # Set the column heading for the index to be \u0026#34;None\u0026#34; df.rename_axis(None, axis=1, inplace = True) # Drop the first 6 rows, 0 - 5 df.drop(df.index[0:6], inplace=True) # Set the date column as the index df.set_index(\u0026#39;Date\u0026#39;, inplace = True) # Drop the volume column try: df.drop(columns = {\u0026#39;PX_VOLUME\u0026#39;}, inplace = True) except KeyError: pass # Rename column df.rename(columns = {\u0026#39;PX_LAST\u0026#39;:\u0026#39;Close\u0026#39;}, inplace = True) # Sort by date df.sort_values(by=[\u0026#39;Date\u0026#39;], inplace = True) # Create directory directory = f\u0026#34;{base_directory}/{source}/{asset_class}/Daily\u0026#34; os.makedirs(directory, exist_ok=True) # Export to excel if excel_export == True: df.to_excel(f\u0026#34;{directory}/{fund_ticker_name}_Clean.xlsx\u0026#34;, sheet_name=\u0026#34;data\u0026#34;) else: pass # Export to pickle if pickle_export == True: 
df.to_pickle(f\u0026#34;{directory}/{fund_ticker_name}_Clean.pkl\u0026#34;) else: pass # Output confirmation if output_confirmation == True: print(f\u0026#34;The first and last date of data for {fund_ticker_name} is: \u0026#34;) display(df[:1]) display(df[-1:]) print(f\u0026#34;Bloomberg data cleaning complete for {fund_ticker_name}\u0026#34;) print(f\u0026#34;--------------------\u0026#34;) else: pass return df build_index 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 from pathlib import Path def build_index() -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34; Build a Hugo-compatible index.md by combining Markdown fragments. This function reads a template file (`index_temp.md`) and a list of markdown dependencies from `index_dep.txt`. For each entry in the dependency list, it replaces a corresponding placeholder in the template (formatted as \u0026lt;!-- INSERT_\u0026lt;name\u0026gt;_HERE --\u0026gt;) with the content from the markdown file. If a file is missing, the placeholder is replaced with a warning note. Output: ------- - Writes the final assembled content to `index.md`. Raises: ------- FileNotFoundError: If either `index_temp.md` or `index_dep.txt` does not exist. Example: -------- If `index_dep.txt` contains: 01_intro.md 02_analysis.md And `index_temp.md` contains: \u0026lt;!-- INSERT_01_intro_HERE --\u0026gt; \u0026lt;!-- INSERT_02_analysis_HERE --\u0026gt; The resulting `index.md` will include the contents of the respective markdown files in place of their placeholders. 
\u0026#34;\u0026#34;\u0026#34; temp_index_path = Path(\u0026#34;index_temp.md\u0026#34;) final_index_path = Path(\u0026#34;index.md\u0026#34;) dependencies_path = Path(\u0026#34;index_dep.txt\u0026#34;) # Read the index template if not temp_index_path.exists(): raise FileNotFoundError(\u0026#34;Missing index_temp.md\u0026#34;) temp_index_content = temp_index_path.read_text() # Read dependency list if not dependencies_path.exists(): raise FileNotFoundError(\u0026#34;Missing index_dep.txt\u0026#34;) with dependencies_path.open(\u0026#34;r\u0026#34;) as f: markdown_files = [line.strip() for line in f if line.strip()] # Replace placeholders for each dependency final_index_content = temp_index_content for md_file in markdown_files: placeholder = f\u0026#34;\u0026lt;!-- INSERT_{Path(md_file).stem}_HERE --\u0026gt;\u0026#34; if Path(md_file).exists(): content = Path(md_file).read_text() final_index_content = final_index_content.replace(placeholder, content) else: print(f\u0026#34;⚠️ Warning: {md_file} not found, skipping placeholder {placeholder}\u0026#34;) final_index_content = final_index_content.replace(placeholder, f\u0026#34;*{md_file} not found*\u0026#34;) # Write final index.md final_index_path.write_text(final_index_content) print(\u0026#34;✅ index.md successfully built!\u0026#34;) if __name__ == \u0026#34;__main__\u0026#34;: build_index() calc_vix_trade_pnl 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 101 102 103 104 105 106 107 108 109 110 111 import pandas as pd def calc_vix_trade_pnl( transaction_df: pd.DataFrame, exp_start_date: str, exp_end_date: str, trade_start_date: str, trade_end_date: str, ) -\u0026gt; tuple[pd.DataFrame, pd.DataFrame, pd.DataFrame, str, str, str, str]: \u0026#34;\u0026#34;\u0026#34; Calculate 
the profit and loss (PnL) of trades based on transaction data. Parameters: ----------- transaction_df : pd.DataFrame DataFrame containing transaction data. exp_start_date : str Start date for filtering transactions in \u0026#39;YYYY-MM-DD\u0026#39; format. This is the start of the range for the option expiration date. exp_end_date : str End date for filtering transactions in \u0026#39;YYYY-MM-DD\u0026#39; format. This is the end of the range for the option expiration date. trade_start_date : str Start date for filtering transactions in \u0026#39;YYYY-MM-DD\u0026#39; format. This is the start of the range for the trade date. trade_end_date : str End date for filtering transactions in \u0026#39;YYYY-MM-DD\u0026#39; format. This is the end of the range for the trade date. Returns: -------- transactions_data : pd.DataFrame Dataframe containing the transactions for the specified timeframe. closed_trades : pd.DataFrame DataFrame containing the closed trades with realized PnL and percent PnL. open_trades : pd.DataFrame DataFrame containing the open trades. net_PnL_percent_str : str String representation of the net profit percentage. net_PnL_str : str String representation of the net profit and loss in dollars. 
\u0026#34;\u0026#34;\u0026#34; # If start and end dates for trades and expirations are None, use the entire DataFrame (copy to avoid mutating the caller\u0026#39;s DataFrame) if exp_start_date is None and exp_end_date is None and trade_start_date is None and trade_end_date is None: transactions_data = transaction_df.copy() # If both start and end dates for trades and expirations are provided then filter by both else: transactions_data = transaction_df[ (transaction_df[\u0026#39;Exp_Date\u0026#39;] \u0026gt;= exp_start_date) \u0026amp; (transaction_df[\u0026#39;Exp_Date\u0026#39;] \u0026lt;= exp_end_date) \u0026amp; (transaction_df[\u0026#39;Trade_Date\u0026#39;] \u0026gt;= trade_start_date) \u0026amp; (transaction_df[\u0026#39;Trade_Date\u0026#39;] \u0026lt;= trade_end_date) ].copy() # Combine the \u0026#39;Trade_Date\u0026#39;, \u0026#39;Action\u0026#39;, \u0026#39;Symbol\u0026#39;, and \u0026#39;Approx_VIX_Level\u0026#39; columns to create a unique identifier for each transaction transactions_data[\u0026#39;TradeDate_Action_Symbol_VIX\u0026#39;] = ( transactions_data[\u0026#39;Trade_Date\u0026#39;].astype(str) + \u0026#34;, \u0026#34; + transactions_data[\u0026#39;Action\u0026#39;] + \u0026#34;, \u0026#34; + transactions_data[\u0026#39;Symbol\u0026#39;] + \u0026#34;, VIX = \u0026#34; + transactions_data[\u0026#39;Approx_VIX_Level\u0026#39;].astype(str) ) # Split buys and sells and sum the notional amounts transactions_sells = transactions_data[transactions_data[\u0026#39;Action\u0026#39;] == \u0026#39;Sell to Close\u0026#39;] transactions_sells = transactions_sells.groupby([\u0026#39;Symbol\u0026#39;, \u0026#39;Exp_Date\u0026#39;], as_index=False)[[\u0026#39;Amount\u0026#39;, \u0026#39;Quantity\u0026#39;]].sum() transactions_buys = transactions_data[transactions_data[\u0026#39;Action\u0026#39;] == \u0026#39;Buy to Open\u0026#39;] transactions_buys = transactions_buys.groupby([\u0026#39;Symbol\u0026#39;, \u0026#39;Exp_Date\u0026#39;], as_index=False)[[\u0026#39;Amount\u0026#39;, \u0026#39;Quantity\u0026#39;]].sum() # Merge buys and sells dataframes back together merged_transactions = 
pd.merge(transactions_buys, transactions_sells, on=[\u0026#39;Symbol\u0026#39;, \u0026#39;Exp_Date\u0026#39;], how=\u0026#39;outer\u0026#39;, suffixes=(\u0026#39;_Buy\u0026#39;, \u0026#39;_Sell\u0026#39;)) merged_transactions = merged_transactions.sort_values(by=[\u0026#39;Exp_Date\u0026#39;], ascending=[True]) merged_transactions = merged_transactions.reset_index(drop=True) # Identify the closed positions merged_transactions[\u0026#39;Closed\u0026#39;] = (~merged_transactions[\u0026#39;Amount_Sell\u0026#39;].isna()) \u0026amp; (~merged_transactions[\u0026#39;Amount_Buy\u0026#39;].isna()) \u0026amp; (merged_transactions[\u0026#39;Quantity_Buy\u0026#39;] == merged_transactions[\u0026#39;Quantity_Sell\u0026#39;]) # Create a new dataframe for closed positions closed_trades = merged_transactions[merged_transactions[\u0026#39;Closed\u0026#39;]] closed_trades = closed_trades.reset_index(drop=True) closed_trades[\u0026#39;Realized_PnL\u0026#39;] = closed_trades[\u0026#39;Amount_Sell\u0026#39;] - closed_trades[\u0026#39;Amount_Buy\u0026#39;] closed_trades[\u0026#39;Percent_PnL\u0026#39;] = closed_trades[\u0026#39;Realized_PnL\u0026#39;] / closed_trades[\u0026#39;Amount_Buy\u0026#39;] closed_trades.drop(columns={\u0026#39;Closed\u0026#39;, \u0026#39;Exp_Date\u0026#39;}, inplace=True) closed_trades[\u0026#39;Quantity_Sell\u0026#39;] = closed_trades[\u0026#39;Quantity_Sell\u0026#39;].astype(int) # Calculate the net % PnL net_PnL_percent = closed_trades[\u0026#39;Realized_PnL\u0026#39;].sum() / closed_trades[\u0026#39;Amount_Buy\u0026#39;].sum() net_PnL_percent_str = f\u0026#34;{round(net_PnL_percent * 100, 2)}%\u0026#34; # Calculate the net $ PnL net_PnL = closed_trades[\u0026#39;Realized_PnL\u0026#39;].sum() net_PnL_str = f\u0026#34;${net_PnL:,.2f}\u0026#34; # Create a new dataframe for open positions open_trades = merged_transactions[~merged_transactions[\u0026#39;Closed\u0026#39;]] open_trades = open_trades.reset_index(drop=True) 
open_trades.drop(columns={\u0026#39;Closed\u0026#39;, \u0026#39;Amount_Sell\u0026#39;, \u0026#39;Quantity_Sell\u0026#39;, \u0026#39;Exp_Date\u0026#39;}, inplace=True) # Calculate the total market value of opened positions # If start and end dates for trades and expirations are None, use only the closed positions if exp_start_date is None and exp_end_date is None and trade_start_date is None and trade_end_date is None: total_opened_pos_mkt_val = closed_trades[\u0026#39;Amount_Buy\u0026#39;].sum() else: total_opened_pos_mkt_val = closed_trades[\u0026#39;Amount_Buy\u0026#39;].sum() + open_trades[\u0026#39;Amount_Buy\u0026#39;].sum() total_opened_pos_mkt_val_str = f\u0026#34;${total_opened_pos_mkt_val:,.2f}\u0026#34; # Calculate the total market value of closed positions total_closed_pos_mkt_val = closed_trades[\u0026#39;Amount_Sell\u0026#39;].sum() total_closed_pos_mkt_val_str = f\u0026#34;${total_closed_pos_mkt_val:,.2f}\u0026#34; return transactions_data, closed_trades, open_trades, net_PnL_percent_str, net_PnL_str, total_opened_pos_mkt_val_str, total_closed_pos_mkt_val_str coinbase_fetch_available_products 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 import pandas as pd import requests def coinbase_fetch_available_products( base_currency: str, quote_currency: str, status: str, ) -\u0026gt; pd.DataFrame: \u0026#34;\u0026#34;\u0026#34; Fetch available products from Coinbase Exchange API. Parameters: ----------- base_currency : str, optional Filter products by base currency (e.g., \u0026#39;BTC\u0026#39;). quote_currency : str, optional Filter products by quote currency (e.g., \u0026#39;USD\u0026#39;). status : str, optional Filter products by status (e.g., \u0026#39;online\u0026#39;, \u0026#39;offline\u0026#39;). 
Returns: -------- pd.DataFrame DataFrame containing available products with their details. \u0026#34;\u0026#34;\u0026#34; url = \u0026#39;https://api.exchange.coinbase.com/products\u0026#39; try: response = requests.get(url, timeout=10) response.raise_for_status() products = response.json() # Convert the list of products into a pandas DataFrame df = pd.DataFrame(products) # Filter by base_currency if provided if base_currency: df = df[df[\u0026#39;base_currency\u0026#39;] == base_currency] # Filter by quote_currency if provided if quote_currency: df = df[df[\u0026#39;quote_currency\u0026#39;] == quote_currency] # Filter by status if provided if status: df = df[df[\u0026#39;status\u0026#39;] == status] # Sort by \u0026#34;id\u0026#34; df = df.sort_values(by=\u0026#39;id\u0026#39;) return df except requests.exceptions.HTTPError as errh: print(f\u0026#34;HTTP Error: {errh}\u0026#34;) except requests.exceptions.ConnectionError as errc: print(f\u0026#34;Error Connecting: {errc}\u0026#34;) except requests.exceptions.Timeout as errt: print(f\u0026#34;Timeout Error: {errt}\u0026#34;) except requests.exceptions.RequestException as err: print(f\u0026#34;Oops: Something Else {err}\u0026#34;) if __name__ == \u0026#34;__main__\u0026#34;: # Example usage df = coinbase_fetch_available_products( base_currency=None, quote_currency=\u0026#34;USD\u0026#34;, status=\u0026#34;online\u0026#34;, ) if df is not None: print(df) else: print(\u0026#34;No data returned.\u0026#34;) coinbase_fetch_full_history 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 import pandas as pd import time from coinbase_fetch_historical_candles import coinbase_fetch_historical_candles from datetime import datetime, timedelta def coinbase_fetch_full_history( product_id: str, start: datetime, end: datetime, granularity: int, ) -\u0026gt; pd.DataFrame: 
\u0026#34;\u0026#34;\u0026#34; Fetch full historical data for a given product from Coinbase Exchange API. Parameters: ----------- product_id : str The trading pair (e.g., \u0026#39;BTC-USD\u0026#39;). start : datetime Start time in UTC. end : datetime End time in UTC. granularity : int Time slice in seconds (e.g., 3600 for hourly candles). Returns: -------- pd.DataFrame DataFrame containing time, low, high, open, close, volume. \u0026#34;\u0026#34;\u0026#34; all_data = [] current_start = start while current_start \u0026lt; end: current_end = min(current_start + timedelta(seconds=granularity * 300), end) # Fetch max 300 candles per request df = coinbase_fetch_historical_candles(product_id, current_start, current_end, granularity) if df.empty: break all_data.append(df) current_start = df[\u0026#39;time\u0026#39;].iloc[-1] + timedelta(seconds=granularity) time.sleep(0.2) # Small delay to respect rate limits if all_data: full_df = pd.concat(all_data).reset_index(drop=True) return full_df else: return pd.DataFrame() if __name__ == \u0026#34;__main__\u0026#34;: # Example usage df = coinbase_fetch_full_history( product_id=\u0026#34;BTC-USD\u0026#34;, start=datetime(2025, 1, 1), end=datetime(2025, 1, 31), granularity=86_400, ) if df is not None: print(df) else: print(\u0026#34;No data returned.\u0026#34;) coinbase_fetch_historical_candles 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 import pandas as pd import requests import time from datetime import datetime def coinbase_fetch_historical_candles( product_id: str, start: datetime, end: datetime, granularity: int, ) -\u0026gt; pd.DataFrame: \u0026#34;\u0026#34;\u0026#34; Fetch historical candle data for a given product from Coinbase Exchange API. 
Parameters: ----------- product_id : str The trading pair (e.g., \u0026#39;BTC-USD\u0026#39;). start : str Start time in UTC. end : str End time in UTC. granularity : int Time slice in seconds (e.g., 60 for minute candles, 3600 for hourly candles, 86,400 for daily candles). Returns: -------- pd.DataFrame DataFrame containing time, low, high, open, close, volume. \u0026#34;\u0026#34;\u0026#34; url = f\u0026#39;https://api.exchange.coinbase.com/products/{product_id}/candles\u0026#39; params = { \u0026#39;start\u0026#39;: start.isoformat(), \u0026#39;end\u0026#39;: end.isoformat(), \u0026#39;granularity\u0026#39;: granularity } max_retries = 5 retry_delay = 1 # initial delay in seconds for attempt in range(max_retries): try: response = requests.get(url, params=params, timeout=10) response.raise_for_status() data = response.json() # Coinbase Exchange API returns data in reverse chronological order data = data[::-1] # Convert to DataFrame df = pd.DataFrame(data, columns=[\u0026#39;time\u0026#39;, \u0026#39;low\u0026#39;, \u0026#39;high\u0026#39;, \u0026#39;open\u0026#39;, \u0026#39;close\u0026#39;, \u0026#39;volume\u0026#39;]) df[\u0026#39;time\u0026#39;] = pd.to_datetime(df[\u0026#39;time\u0026#39;], unit=\u0026#39;s\u0026#39;) return df except requests.exceptions.HTTPError as errh: if response.status_code == 429: print(f\u0026#34;Rate limit exceeded. 
Retrying in {retry_delay} seconds...\u0026#34;) time.sleep(retry_delay) retry_delay *= 2 # Exponential backoff else: print(f\u0026#34;HTTP Error: {errh}\u0026#34;) break except requests.exceptions.ConnectionError as errc: print(f\u0026#34;Error Connecting: {errc}\u0026#34;) time.sleep(retry_delay) retry_delay *= 2 except requests.exceptions.Timeout as errt: print(f\u0026#34;Timeout Error: {errt}\u0026#34;) time.sleep(retry_delay) retry_delay *= 2 except requests.exceptions.RequestException as err: print(f\u0026#34;OOps: Something Else {err}\u0026#34;) break raise Exception(\u0026#34;Failed to fetch data after multiple retries.\u0026#34;) if __name__ == \u0026#34;__main__\u0026#34;: # Example usage df = coinbase_fetch_historical_candles( product_id=\u0026#34;BTC-USD\u0026#34;, start=datetime(2025, 1, 1), end=datetime(2025, 1, 2), granularity=3_600, ) if df is not None: print(df) else: print(\u0026#34;No data returned.\u0026#34;) coinbase_pull_data 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 101 102 103 104 105 106 107 108 109 110 111 112 113 114 115 116 117 118 119 120 121 122 123 124 125 126 127 128 129 130 131 132 133 134 135 136 137 138 139 140 141 142 143 144 145 146 147 148 149 150 151 152 153 154 155 156 157 158 159 160 161 162 163 164 165 166 167 168 169 170 171 172 173 174 175 176 177 178 179 180 181 182 183 184 185 186 187 188 189 190 191 192 193 194 195 196 197 198 199 200 201 202 203 204 205 206 207 208 209 210 211 212 213 214 215 216 217 218 219 220 221 222 223 224 225 226 227 228 229 230 231 232 233 234 235 236 237 238 239 240 241 242 243 244 245 246 247 248 249 250 251 252 253 254 255 256 257 258 259 260 261 262 263 264 265 266 267 268 269 270 271 272 273 274 275 276 277 278 279 280 281 282 283 284 285 286 
287 288 289 290 291 292 293 294 295 296 297 298 299 300 301 302 303 304 305 306 307 308 309 310 311 312 313 314 315 316 317 318 319 320 321 322 323 324 325 326 327 328 329 330 331 332 333 334 335 336 337 338 339 340 341 342 343 344 345 346 347 348 349 350 351 352 353 354 355 356 357 358 359 360 361 362 363 364 365 366 367 368 369 370 371 372 373 374 375 376 import calendar import os import pandas as pd from coinbase_fetch_available_products import coinbase_fetch_available_products from coinbase_fetch_full_history import coinbase_fetch_full_history from datetime import datetime, timedelta from settings import config # Get the data directory from the configuration DATA_DIR = config(\u0026#34;DATA_DIR\u0026#34;) def coinbase_pull_data( base_directory, source: str, asset_class: str, excel_export: bool, pickle_export: bool, output_confirmation: bool, base_currency: str, quote_currency: str, granularity: int=3600, # 60=minute, 3600=hourly, 86400=daily status: str=\u0026#39;online\u0026#39;, # default status is \u0026#39;online\u0026#39; start_date: datetime=datetime(2025, 1, 1), # default start date end_date: datetime=datetime.now() - timedelta(days=1), # updates data through 1 day ago due to lag in data availability ) -\u0026gt; pd.DataFrame: \u0026#34;\u0026#34;\u0026#34; Update existing record or pull full historical data for a given product from Coinbase Exchange API. Parameters: ----------- base_directory Root path to store downloaded data. source : str Name of the data source (e.g., \u0026#39;Nasdaq_Data_Link\u0026#39;). asset_class : str Asset class name (e.g., \u0026#39;Equities\u0026#39;). excel_export : bool If True, export data to Excel format. pickle_export : bool If True, export data to Pickle format. output_confirmation : bool If True, print confirmation message. base_currency : str The base currency (e.g., \u0026#39;BTC\u0026#39;). quote_currency : str The quote currency (e.g., \u0026#39;USD\u0026#39;). 
status : str, optional Filter products by status (default is \u0026#39;online\u0026#39;). granularity : int Time slice in seconds (e.g., 3600 for hourly candles). start_date : str, optional Start date in UTC (ISO format). end_date : str, optional End date in UTC (ISO format). Returns: -------- None \u0026#34;\u0026#34;\u0026#34; # List of crypto assets filtered_products = coinbase_fetch_available_products(base_currency=base_currency, quote_currency=quote_currency, status=status) filtered_products_list = filtered_products[\u0026#39;id\u0026#39;].tolist() filtered_products_list = sorted(filtered_products_list) if not filtered_products.empty: print(filtered_products[[\u0026#39;id\u0026#39;, \u0026#39;base_currency\u0026#39;, \u0026#39;quote_currency\u0026#39;, \u0026#39;status\u0026#39;]]) print(filtered_products_list) print(len(filtered_products_list)) else: print(\u0026#34;No products found with the specified base and/or quote currencies.\u0026#34;) missing_data = [] omitted_data = [] num_products = len(filtered_products_list) counter = 0 # Loop for updates for product in filtered_products_list: counter+=1 print(f\u0026#34;Updating product {counter} of {num_products}.\u0026#34;) if granularity == 60: time_length = \u0026#34;Minute\u0026#34; elif granularity == 3600: time_length = \u0026#34;Hourly\u0026#34; elif granularity == 86400: time_length = \u0026#34;Daily\u0026#34; else: print(\u0026#34;Error - please confirm timeframe.\u0026#34;) break # Set file location based on parameters file_location = f\u0026#34;{base_directory}/{source}/{asset_class}/{time_length}/{product}.pkl\u0026#34; try: # Attempt to read existing pickle data file ex_data = pd.read_pickle(file_location) ex_data = ex_data.reset_index() print(f\u0026#34;File found...updating the {product} data\u0026#34;) print(\u0026#34;Existing data:\u0026#34;) print(ex_data) # Pull recent data new_data = coinbase_fetch_full_history(product, start_date, end_date, granularity) new_data = 
new_data.rename(columns={\u0026#39;time\u0026#39;:\u0026#39;Date\u0026#39;}) new_data[\u0026#39;Date\u0026#39;] = new_data[\u0026#39;Date\u0026#39;].dt.tz_localize(None) print(\u0026#34;New data:\u0026#34;) print(new_data) # Combine existing data with recent data full_history_df = pd.concat([ex_data,new_data[new_data[\u0026#39;Date\u0026#39;].isin(ex_data[\u0026#39;Date\u0026#39;]) == False]]) full_history_df = full_history_df.sort_values(by=\u0026#39;Date\u0026#39;) full_history_df[\u0026#39;Date\u0026#39;] = full_history_df[\u0026#39;Date\u0026#39;].dt.tz_localize(None) full_history_df = full_history_df.set_index(\u0026#39;Date\u0026#39;) print(\u0026#34;Combined data:\u0026#34;) print(full_history_df) # Create directory directory = f\u0026#34;{base_directory}/{source}/{asset_class}/{time_length}\u0026#34; os.makedirs(directory, exist_ok=True) # Export to excel if excel_export == True: full_history_df.to_excel(f\u0026#34;{directory}/{product}.xlsx\u0026#34;, sheet_name=\u0026#34;data\u0026#34;) else: pass # Export to pickle if pickle_export == True: full_history_df.to_pickle(f\u0026#34;{directory}/{product}.pkl\u0026#34;) else: pass # Output confirmation if output_confirmation == True: print(f\u0026#34;Data update complete for {time_length} {product}.\u0026#34;) print(\u0026#34;--------------------\u0026#34;) else: pass except FileNotFoundError: # Starting year for fetching initial data starting_year = 2025 # Print error print(f\u0026#34;File not found...downloading the {product} data starting with {starting_year}.\u0026#34;) def get_full_hist(year): try: # Define the start and end dates start_date = datetime(year, 1, 1) # Default start date end_date = datetime.now() - timedelta(days = 1) # Updates data through 1 day ago # Fetch and process the data full_history_df = coinbase_fetch_full_history(product, start_date, end_date, granularity) full_history_df = full_history_df.rename(columns={\u0026#39;time\u0026#39;: \u0026#39;Date\u0026#39;}) full_history_df = 
full_history_df.sort_values(by=\u0026#39;Date\u0026#39;) # Iterate through rows to see if the value of the asset ever exceeds a specified threshold # Default value for the price threshold is 0 USD # If the price never exceeds this threshold, the asset is omitted from the final list def find_first_close_above_threshold(full_history_df, threshold=0): # Ensure \u0026#39;Date\u0026#39; is the index before proceeding if \u0026#39;Date\u0026#39; in full_history_df.columns: full_history_df.set_index(\u0026#39;Date\u0026#39;, inplace=True) full_history_df.index = full_history_df.index.tz_localize(None) # Iterate through the DataFrame for index, row in full_history_df.iterrows(): if row[\u0026#39;close\u0026#39;] \u0026gt;= threshold: print(f\u0026#34;First occurrence: {index}, close={row[\u0026#39;close\u0026#39;]}\u0026#34;) # Return the filtered DataFrame starting from this row return full_history_df.loc[index:] # If no value meets the condition, return an empty DataFrame print(f\u0026#34;Share price never exceeds {threshold} USD.\u0026#34;) omitted_data.append(product) return None full_history_above_threshold_df = find_first_close_above_threshold(full_history_df, threshold=0) return full_history_above_threshold_df except KeyError: print(f\u0026#34;KeyError: No data available for {product} in {year}. 
Trying next year...\u0026#34;) next_year = year + 1 # Base case: Stop if the next year exceeds the current year if next_year \u0026gt; datetime.now().year: print(\u0026#34;No more data available for any future years.\u0026#34;) missing_data.append(product) return None # Recursive call for the next year return get_full_hist(year=next_year) # Fetch the full history starting from the given year full_history_df = get_full_hist(year=starting_year) if full_history_df is not None: # Create directory directory = f\u0026#34;{base_directory}/{source}/{asset_class}/{time_length}\u0026#34; os.makedirs(directory, exist_ok=True) # Export to excel if excel_export == True: full_history_df.to_excel(f\u0026#34;{directory}/{product}.xlsx\u0026#34;, sheet_name=\u0026#34;data\u0026#34;) else: pass # Export to pickle if pickle_export == True: full_history_df.to_pickle(f\u0026#34;{directory}/{product}.pkl\u0026#34;) else: pass # Output confirmation if output_confirmation == True: print(f\u0026#34;Initial data fetching completed successfully for {time_length} {product}.\u0026#34;) print(\u0026#34;--------------------\u0026#34;) else: pass else: print(\u0026#34;No data could be fetched for the specified range.\u0026#34;) except Exception as e: print(str(e)) # Remove the cryptocurrencies with missing data from the final list missing_data = sorted(missing_data) print(f\u0026#34;Data missing for: {missing_data}\u0026#34;) for asset in missing_data: try: print(f\u0026#34;Removing {asset} from the list because it is missing data.\u0026#34;) filtered_products_list.remove(asset) except ValueError: print(f\u0026#34;{asset} not in list.\u0026#34;) pass # Remove the cryptocurrencies with share prices that never exceed 1 USD from the final list omitted_data = sorted(omitted_data) print(f\u0026#34;Data omitted due to price for: {omitted_data}\u0026#34;) for asset in omitted_data: try: print(f\u0026#34;Removing {asset} from the list because the share price never exceeds 1 USD.\u0026#34;) 
filtered_products_list.remove(asset) except ValueError: print(f\u0026#34;{asset} not in list.\u0026#34;) pass # Remove stablecoins from the final list stablecoins_to_remove = [\u0026#39;USDT-USD\u0026#39;, \u0026#39;USDC-USD\u0026#39;, \u0026#39;PAX-USD\u0026#39;, \u0026#39;DAI-USD\u0026#39;, \u0026#39;PYUSD-USD\u0026#39;, \u0026#39;GUSD-USD\u0026#39;] stablecoins_to_remove = sorted(stablecoins_to_remove) print(f\u0026#34;Data for stable coins not to be used: {stablecoins_to_remove}\u0026#34;) for asset in stablecoins_to_remove: try: filtered_products_list.remove(asset) # print(f\u0026#34;Removing {asset} from the list because it is a stablecoin.\u0026#34;) except ValueError: # print(f\u0026#34;{asset} not in list.\u0026#34;) pass # Remove the wrapped coins from the final list wrapped_coins_to_remove = [\u0026#39;WAXL-USD\u0026#39;, \u0026#39;WBTC-USD\u0026#39;] wrapped_coins_to_remove = sorted(wrapped_coins_to_remove) print(f\u0026#34;Data for wrapped coins not to be used: {wrapped_coins_to_remove}\u0026#34;) for asset in wrapped_coins_to_remove: try: filtered_products_list.remove(asset) # print(f\u0026#34;Removing {asset} from the list because it is a wrapped coin.\u0026#34;) except ValueError: # print(f\u0026#34;{asset} not in list.\u0026#34;) pass # Print the final list of token and the length of the list print(f\u0026#34;Final list of tokens: {filtered_products_list}\u0026#34;) print(f\u0026#34;Length of final list of tokens: {len(filtered_products_list)}\u0026#34;) return full_history_df if __name__ == \u0026#34;__main__\u0026#34;: # Example usage to pull all data for each month from 2010 to 2024 for granularity in [60, 3600, 86400]: for year in range(2010, 2025): for month in range(1, 13): print(f\u0026#34;Pulling data for {year}-{month:02d}...\u0026#34;) try: # Get the last day of the month last_day = calendar.monthrange(year, month)[1] coinbase_pull_data( base_directory=DATA_DIR, source=\u0026#34;Coinbase\u0026#34;, 
asset_class=\u0026#34;Cryptocurrencies\u0026#34;, excel_export=False, pickle_export=True, output_confirmation=True, base_currency=\u0026#34;BTC\u0026#34;, quote_currency=\u0026#34;USD\u0026#34;, granularity=granularity, # 60=minute, 3600=hourly, 86400=daily status=\u0026#39;online\u0026#39;, start_date=datetime(year, month, 1), end_date=datetime(year, month, last_day), ) except Exception as e: print(f\u0026#34;Failed to pull data for {year}-{month:02d}: {e}\u0026#34;) # current_year = datetime.now().year # current_month = datetime.now().month # current_day = datetime.now().day # # Crypto Data # currencies = [\u0026#34;BTC\u0026#34;, \u0026#34;ETH\u0026#34;, \u0026#34;SOL\u0026#34;, \u0026#34;XRP\u0026#34;] # # Iterate through each currency # for cur in currencies: # # Example usage - minute # coinbase_pull_data( # base_directory=DATA_DIR, # source=\u0026#34;Coinbase\u0026#34;, # asset_class=\u0026#34;Cryptocurrencies\u0026#34;, # excel_export=False, # pickle_export=True, # output_confirmation=True, # base_currency=cur, # quote_currency=\u0026#34;USD\u0026#34;, # granularity=60, # 60=minute, 3600=hourly, 86400=daily # status=\u0026#39;online\u0026#39;, # default status is \u0026#39;online\u0026#39; # start_date=datetime(current_year, current_month - 1, 1), # default start date # end_date=datetime.now() - timedelta(days=1), # updates data through 1 day ago due to lag in data availability # ) # # Example usage - hourly # coinbase_pull_data( # base_directory=DATA_DIR, # source=\u0026#34;Coinbase\u0026#34;, # asset_class=\u0026#34;Cryptocurrencies\u0026#34;, # excel_export=True, # pickle_export=True, # output_confirmation=True, # base_currency=cur, # quote_currency=\u0026#34;USD\u0026#34;, # granularity=3600, # 60=minute, 3600=hourly, 86400=daily # status=\u0026#39;online\u0026#39;, # default status is \u0026#39;online\u0026#39; # start_date=datetime(current_year, current_month - 1, 1), # default start date # end_date=datetime.now() - timedelta(days=1), # updates data 
through 1 day ago due to lag in data availability # ) # # Example usage - daily # coinbase_pull_data( # base_directory=DATA_DIR, # source=\u0026#34;Coinbase\u0026#34;, # asset_class=\u0026#34;Cryptocurrencies\u0026#34;, # excel_export=True, # pickle_export=True, # output_confirmation=True, # base_currency=cur, # quote_currency=\u0026#34;USD\u0026#34;, # granularity=86400, # 60=minute, 3600=hourly, 86400=daily # status=\u0026#39;online\u0026#39;, # default status is \u0026#39;online\u0026#39; # start_date=datetime(current_year, current_month - 1, 1), # default start date # end_date=datetime.now() - timedelta(days=1), # updates data through 1 day ago due to lag in data availability # ) df_info 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 import pandas as pd from IPython.display import display def df_info( df: pd.DataFrame, ) -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34; Display summary information about a pandas DataFrame. This function prints: - The DataFrame\u0026#39;s column names, shape, and data types via `df.info()` - The first 5 rows using `df.head()` - The last 5 rows using `df.tail()` It uses `display()` for better output formatting in environments like Jupyter notebooks. Parameters: ----------- df : pd.DataFrame The DataFrame to inspect. 
Returns: -------- None Example: -------- \u0026gt;\u0026gt;\u0026gt; df_info(my_dataframe) \u0026#34;\u0026#34;\u0026#34; print(\u0026#34;The columns, shape, and data types are:\u0026#34;) print(df.info()) print(\u0026#34;The first 5 rows are:\u0026#34;) display(df.head()) print(\u0026#34;The last 5 rows are:\u0026#34;) display(df.tail()) df_info_markdown 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 import io import pandas as pd def df_info_markdown( df: pd.DataFrame, decimal_places: int = 2, ) -\u0026gt; str: \u0026#34;\u0026#34;\u0026#34; Generate a Markdown-formatted summary of a pandas DataFrame. This function captures and formats the output of `df.info()`, `df.head()`, and `df.tail()` in Markdown for easy inclusion in reports, documentation, or web-based rendering (e.g., Hugo or Jupyter export workflows). Parameters: ----------- df : pd.DataFrame The DataFrame to summarize. Returns: -------- str A string containing the DataFrame\u0026#39;s info, head, and tail formatted in Markdown. Example: -------- \u0026gt;\u0026gt;\u0026gt; print(df_info_markdown(df)) ```text The columns, shape, and data types are: \u0026lt;output from df.info()\u0026gt; ``` The first 5 rows are: | | col1 | col2 | |---|------|------| | 0 | ... | ... | The last 5 rows are: ... 
\u0026#34;\u0026#34;\u0026#34; buffer = io.StringIO() # Capture df.info() output df.info(buf=buffer) info_str = buffer.getvalue() # Convert head and tail to Markdown head_str = df.head().to_markdown(floatfmt=f\u0026#34;.{decimal_places}f\u0026#34;) tail_str = df.tail().to_markdown(floatfmt=f\u0026#34;.{decimal_places}f\u0026#34;) markdown = [ \u0026#34;```text\u0026#34;, \u0026#34;The columns, shape, and data types are:\\n\u0026#34;, info_str, \u0026#34;```\u0026#34;, \u0026#34;\\nThe first 5 rows are:\\n\u0026#34;, head_str, \u0026#34;\\nThe last 5 rows are:\\n\u0026#34;, tail_str ] return \u0026#34;\\n\u0026#34;.join(markdown) export_track_md_deps 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 from pathlib import Path def export_track_md_deps( dep_file: Path, md_filename: str, content: str, ) -\u0026gt; None: \u0026#34;\u0026#34;\u0026#34; Export Markdown content to a file and track it as a dependency. This function writes the provided content to the specified Markdown file and appends the filename to the given dependency file (typically `index_dep.txt`). This is useful in workflows where Markdown fragments are later assembled into a larger document (e.g., a Hugo `index.md`). Parameters: ----------- dep_file : Path Path to the dependency file that tracks Markdown fragment filenames. md_filename : str The name of the Markdown file to export. content : str The Markdown-formatted content to write to the file. 
Returns: -------- None Example: -------- \u0026gt;\u0026gt;\u0026gt; export_track_md_deps(Path(\u0026#34;index_dep.txt\u0026#34;), \u0026#34;01_intro.md\u0026#34;, \u0026#34;# Introduction\\n...\u0026#34;) ✅ Exported and tracked: 01_intro.md \u0026#34;\u0026#34;\u0026#34; Path(md_filename).write_text(content) with dep_file.open(\u0026#34;a\u0026#34;) as f: f.write(md_filename + \u0026#34;\\n\u0026#34;) print(f\u0026#34;✅ Exported and tracked: {md_filename}\u0026#34;) load_api_keys 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 import os from dotenv import load_dotenv from pathlib import Path from settings import config # Get the environment variable file path from the configuration ENV_PATH = config(\u0026#34;ENV_PATH\u0026#34;) def load_api_keys( env_path: Path=ENV_PATH ) -\u0026gt; dict: \u0026#34;\u0026#34;\u0026#34; Load API keys from a .env file. Parameters: ----------- env_path : Path Path to the .env file. Default is the ENV_PATH from settings. Returns: -------- keys : dict Dictionary of API keys. 
\u0026#34;\u0026#34;\u0026#34; load_dotenv(dotenv_path=env_path) keys = { \u0026#34;INFURA_KEY\u0026#34;: os.getenv(\u0026#34;INFURA_KEY\u0026#34;), \u0026#34;NASDAQ_DATA_LINK_KEY\u0026#34;: os.getenv(\u0026#34;NASDAQ_DATA_LINK_KEY\u0026#34;), \u0026#34;COINBASE_KEY\u0026#34;: os.getenv(\u0026#34;COINBASE_KEY\u0026#34;), \u0026#34;COINBASE_SECRET\u0026#34;: os.getenv(\u0026#34;COINBASE_SECRET\u0026#34;), \u0026#34;SCHWAB_APP_KEY\u0026#34;: os.getenv(\u0026#34;SCHWAB_APP_KEY\u0026#34;), \u0026#34;SCHWAB_SECRET\u0026#34;: os.getenv(\u0026#34;SCHWAB_SECRET\u0026#34;), \u0026#34;SCHWAB_ACCOUNT_NUMBER_1\u0026#34;: os.getenv(\u0026#34;SCHWAB_ACCOUNT_NUMBER_1\u0026#34;), \u0026#34;SCHWAB_ENCRYPTED_ACCOUNT_ID_1\u0026#34;: os.getenv(\u0026#34;SCHWAB_ENCRYPTED_ACCOUNT_ID_1\u0026#34;), \u0026#34;SCHWAB_ACCOUNT_NUMBER_2\u0026#34;: os.getenv(\u0026#34;SCHWAB_ACCOUNT_NUMBER_2\u0026#34;), \u0026#34;SCHWAB_ENCRYPTED_ACCOUNT_ID_2\u0026#34;: os.getenv(\u0026#34;SCHWAB_ENCRYPTED_ACCOUNT_ID_2\u0026#34;), \u0026#34;SCHWAB_ACCOUNT_NUMBER_3\u0026#34;: os.getenv(\u0026#34;SCHWAB_ACCOUNT_NUMBER_3\u0026#34;), \u0026#34;SCHWAB_ENCRYPTED_ACCOUNT_ID_3\u0026#34;: os.getenv(\u0026#34;SCHWAB_ENCRYPTED_ACCOUNT_ID_3\u0026#34;), \u0026#34;POLYGON_KEY\u0026#34;: os.getenv(\u0026#34;POLYGON_KEY\u0026#34;), } # Raise error if any key is missing for k, v in keys.items(): if not v: raise ValueError(f\u0026#34;Missing environment variable: {k}\u0026#34;) return keys if __name__ == \u0026#34;__main__\u0026#34;: # Example usage api_keys = load_api_keys() print(\u0026#34;API keys loaded successfully.\u0026#34;) load_data 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 import pandas as pd from pathlib import Path def load_data( base_directory, ticker: str, source: str, asset_class: str, timeframe: str, file_format: str, ) -\u0026gt; pd.DataFrame: 
```python
    """
    Load data from a CSV, Excel, or Pickle file into a pandas DataFrame,
    based on the specified file format. If an unsupported format is given,
    a ValueError is raised.

    Parameters:
    -----------
    base_directory
        Root path to read data file.
    ticker : str
        Ticker symbol to read.
    source : str
        Name of the data source (e.g., 'Yahoo').
    asset_class : str
        Asset class name (e.g., 'Equities').
    timeframe : str
        Timeframe for the data (e.g., 'Daily', 'Month_End').
    file_format : str
        Format of the file to load ('csv', 'excel', or 'pickle').

    Returns:
    --------
    pd.DataFrame
        The loaded data.

    Raises:
    -------
    ValueError
        If the file format is not 'csv', 'excel', or 'pickle'.

    Example:
    --------
    >>> df = load_data(DATA_DIR, "^VIX", "Yahoo_Finance", "Indices")
    """

    if file_format == "csv":
        csv_path = Path(base_directory) / source / asset_class / timeframe / f"{ticker}.csv"
        df = pd.read_csv(csv_path)
        return df
    elif file_format == "excel":
        xlsx_path = Path(base_directory) / source / asset_class / timeframe / f"{ticker}.xlsx"
        df = pd.read_excel(xlsx_path, sheet_name="data", engine="calamine")
        return df
    elif file_format == "pickle":
        pickle_path = Path(base_directory) / source / asset_class / timeframe / f"{ticker}.pkl"
        df = pd.read_pickle(pickle_path)
        return df
    else:
        raise ValueError(f"❌ Unsupported file format: {file_format}. Please use 'csv', 'excel', or 'pickle'.")
```

pandas_set_decimal_places

```python
import pandas as pd

def pandas_set_decimal_places(
    decimal_places: int,
) -> None:
    """
    Set the number of decimal places displayed for floating-point numbers in pandas.

    Parameters:
    -----------
    decimal_places : int
        The number of decimal places to display for float values in pandas
        DataFrames and Series.

    Returns:
    --------
    None

    Example:
    --------
    >>> pandas_set_decimal_places(3)
    >>> pd.DataFrame([1.23456789])
           0
    0  1.235
    """

    pd.set_option('display.float_format', lambda x: f'%.{decimal_places}f' % x)
```

plot_timeseries

plot_stats

```python
import matplotlib.pyplot as plt
import pandas as pd
from matplotlib.ticker import MultipleLocator

def plot_stats(
    stats_df: pd.DataFrame,
    plot_columns,
    title: str,
    x_label: str,
    x_rotation: int,
    x_tick_spacing: int,
    y_label: str,
    y_tick_spacing: int,
    grid: bool,
    legend: bool,
    export_plot: bool,
    plot_file_name: str,
) -> None:
    """
    Plot the statistics data from a DataFrame for the specified columns.

    Parameters:
    -----------
    stats_df : pd.DataFrame
        DataFrame containing the statistics data to plot.
    plot_columns : str or list
        List of columns to plot from the DataFrame. If "All", all columns
        will be plotted.
    title : str
        Title of the plot.
    x_label : str
        Label for the x-axis.
    x_rotation : int
        Rotation angle for the x-axis labels.
    x_tick_spacing : int
        Spacing for the x-axis ticks.
    y_label : str
        Label for the y-axis.
    y_tick_spacing : int
        Spacing for the y-axis ticks.
    grid : bool
        Whether to display a grid on the plot.
    legend : bool
        Whether to display a legend on the plot.
    export_plot : bool
        Whether to save the figure as a PNG file.
    plot_file_name : str
        File name for saving the figure (if export_plot is True).

    Returns:
    --------
    None
    """

    # Set plot figure size and background color
    plt.figure(figsize=(12, 6), facecolor="#F5F5F5")

    # Plot data
    if plot_columns == "All":
        for col in stats_df.columns:
            plt.scatter(stats_df.index, stats_df[col], label=col)
    else:
        for col in plot_columns:
            plt.scatter(stats_df.index, stats_df[col], label=col)

    # Format X axis
    plt.gca().xaxis.set_major_locator(MultipleLocator(x_tick_spacing))
    plt.xlabel(x_label, fontsize=10)
    plt.xticks(rotation=x_rotation, fontsize=8)

    # Format Y axis
    plt.gca().yaxis.set_major_locator(MultipleLocator(y_tick_spacing))
    plt.ylabel(y_label, fontsize=10)
    plt.yticks(fontsize=8)

    # Format title, layout, grid, and legend
    plt.title(title, fontsize=12)
    plt.tight_layout()
    if grid:
        plt.grid(True, linestyle='--', alpha=0.7)
    if legend:
        plt.legend(fontsize=9)

    # Save figure and display plot
    if export_plot:
        plt.savefig(f"{plot_file_name}.png", dpi=300, bbox_inches="tight")

    # Display the plot
    plt.show()

    return None
```

plot_vix_with_trades

```python
import matplotlib.dates as mdates
import matplotlib.pyplot as plt
import pandas as pd
from matplotlib.ticker import MultipleLocator

def plot_vix_with_trades(
    vix_price_df: pd.DataFrame,
    trades_df: pd.DataFrame,
    plot_start_date: str,
    plot_end_date: str,
    x_tick_spacing: int,
    y_tick_spacing: int,
    index_number: str,
    export_plot: bool,
) -> pd.DataFrame:
    """
    Plot the VIX daily high and low prices, along with the VIX spikes and trades.

    Parameters:
    -----------
    vix_price_df : pd.DataFrame
        DataFrame containing the VIX price data to plot.
    trades_df : pd.DataFrame
        DataFrame containing the trades data.
    plot_start_date : str
        Start date for the plot in 'YYYY-MM-DD' format.
    plot_end_date : str
        End date for the plot in 'YYYY-MM-DD' format.
    x_tick_spacing : int
        Spacing for the x-axis ticks.
    y_tick_spacing : int
        Spacing for the y-axis ticks.
    index_number : str
        Index number to be used in the file name of the plot export.
    export_plot : bool
        Whether to save the figure as a PNG file.

    Returns:
    --------
    vix_data : pd.DataFrame
        DataFrame containing the VIX price data for the specified timeframe.
    """

    # Create temporary dataframe for the specified date range
    vix_data = vix_price_df[(vix_price_df.index >= plot_start_date) & (vix_price_df.index <= plot_end_date)]

    # Set plot figure size and background color
    plt.figure(figsize=(12, 6), facecolor="#F5F5F5")

    # Plot VIX high and low price data
    plt.plot(vix_data.index, vix_data['High'], label='High', linestyle='-', color='steelblue', linewidth=1)
    plt.plot(vix_data.index, vix_data['Low'], label='Low', linestyle='-', color='brown', linewidth=1)

    # Plot VIX spikes
    plt.scatter(
        vix_data[vix_data['Spike_SMA'] == True].index,
        vix_data[vix_data['Spike_SMA'] == True]['High'],
        label='Spike (High > 1.25 * 10 Day High SMA)',
        color='black',
        s=20,
    )

    # Plot trades
    plt.scatter(
        trades_df['Trade_Date'],
        trades_df['Approx_VIX_Level'],
        label='Trades',
        color='red',
        s=20,
    )

    # Annotate each point in trades_df with the corresponding Action_Symbol
    for _, row in trades_df.iterrows():
        plt.text(
            row['Trade_Date'] + pd.Timedelta(days=1),
            row['Approx_VIX_Level'] + 0.1,
            row['TradeDate_Action_Symbol_VIX'],
            fontsize=9,
        )

    # Format X axis
    plt.gca().xaxis.set_major_locator(mdates.DayLocator(interval=x_tick_spacing))
    plt.gca().xaxis.set_major_formatter(mdates.DateFormatter("%Y-%m-%d"))
    plt.xlabel("Date", fontsize=10)
    plt.xticks(rotation=45, fontsize=8)

    # Format Y axis
    plt.gca().yaxis.set_major_locator(MultipleLocator(y_tick_spacing))
    plt.ylabel("VIX", fontsize=10)
    plt.yticks(fontsize=8)

    # Format title, layout, grid, and legend
    plt.title(f"CBOE Volatility Index (VIX), VIX Spikes, Trades, {plot_start_date} - {plot_end_date}", fontsize=12)
    plt.tight_layout()
    plt.grid(True, linestyle='--', alpha=0.7)
    plt.legend(fontsize=9)

    # Save figure and display plot
    if export_plot:
        plt.savefig(f"{index_number}_VIX_Spike_Trades.png", dpi=300, bbox_inches="tight")

    # Display the plot
    plt.show()

    return vix_data
```

polygon_fetch_full_history

```python
import time
from datetime import datetime, timedelta

import pandas as pd
from load_api_keys import load_api_keys
from polygon import RESTClient
from settings import config

# Load API keys from the environment
api_keys = load_api_keys()

# Get the environment variable for where data is stored
DATA_DIR = config("DATA_DIR")

def polygon_fetch_full_history(
    client,
    ticker: str,
    timespan: str,
    multiplier: int,
    adjusted: bool,
    existing_history_df: pd.DataFrame,
    current_start: datetime,
    free_tier: bool,
    verbose: bool,
) -> pd.DataFrame:
    """
    Fetch full historical data for a given product from the Polygon API.

    Parameters:
    -----------
    client
        Polygon API client instance.
    ticker : str
        Ticker symbol to download.
    timespan : str
        Time span for the data (e.g., "minute", "hour", "day", "week",
        "month", "quarter", "year").
    multiplier : int
        Multiplier for the time span (e.g., 1 for daily data).
    adjusted : bool
        If True, return adjusted data; if False, return raw data.
    existing_history_df : pd.DataFrame
        DataFrame containing the existing data.
    current_start : datetime
        Date for which to start pulling data.
    free_tier : bool
        If True, pause between requests to avoid API rate limits.
    verbose : bool
        If True, print detailed information about the data being processed.

    Returns:
    --------
    full_history_df : pd.DataFrame
        DataFrame containing the full history.
    """

    # Copy DataFrame
    full_history_df = existing_history_df.copy()

    if timespan == "minute":
        time_delta = 15
        time_overlap = 1
    elif timespan == "hour":
        time_delta = 15
        time_overlap = 1
    elif timespan == "day":
        time_delta = 180
        time_overlap = 1
    else:
        raise Exception(f"Invalid timespan: {timespan}.")

    new_data_last_date = None
    new_date_last_date_check = None

    while current_start < datetime.now():
        # Offset end date by time_delta
        current_end = current_start + timedelta(days=time_delta)

        if verbose:
            print(f"Pulling {timespan} data for {current_start} thru {current_end} for {ticker}...\n")

        try:
            # Pull new data
            aggs = client.get_aggs(
                ticker=ticker,
                timespan=timespan,
                multiplier=multiplier,
                from_=current_start,
                to=current_end,
                adjusted=adjusted,
                sort="asc",
                limit=5000,
            )

            # Convert to DataFrame
            new_data = pd.DataFrame([bar.__dict__ for bar in aggs])
            new_data["timestamp"] = pd.to_datetime(new_data["timestamp"], unit="ms")
            new_data = new_data.rename(columns={'timestamp': 'Date'})
            new_data = new_data[['Date', 'open', 'high', 'low', 'close', 'volume', 'vwap', 'transactions', 'otc']]
            new_data = new_data.sort_values(by='Date', ascending=True)

            # Enforce dtypes to match full_history_df
            new_data = new_data.astype(full_history_df.dtypes.to_dict())

            # Find last date in new_data
            new_data_last_date = new_data['Date'].max()

            if verbose:
                print("New data:")
                print(new_data)

            if not full_history_df.empty:
                # Columns present in both frames
                common_cols = list(full_history_df.columns.intersection(new_data.columns))
                if not common_cols:
                    raise Exception("No common columns to compare.")

                # De-duplicate to speed up the merge
                full_dedup = full_history_df[common_cols].drop_duplicates()
                new_dedup = new_data[common_cols].drop_duplicates()

                # Inner join on every shared column = exact row matches
                overlap = full_dedup.merge(new_dedup, on=common_cols, how="inner")
                if overlap.empty:
                    raise Exception("New data does not overlap with existing data (full-row check).")

            # Combine existing data with recent data, drop duplicates, sort values, reset index
            full_history_df = pd.concat([full_history_df, new_data])
            full_history_df = full_history_df.drop_duplicates(subset="Date", keep="last")
            full_history_df = full_history_df.sort_values(by='Date', ascending=True)
            full_history_df = full_history_df.reset_index(drop=True)

            if verbose:
                print("Combined data:")
                print(full_history_df)

        except KeyError:
            print(f"No data is available for {ticker} from {current_start} thru {current_end}.")
            user_choice = input("Press Enter to continue, or type 'q' to quit: ")
            if user_choice.lower() == "q":
                print(f"Aborting operation to update {ticker} {timespan} data.")
                break  # Break out of the while loop
            else:
                print(f"Trying next timeframe for {ticker} {timespan} data.")
                # Set new_data_last_date to current_end because we know data
                # was not available up until current_end
                new_data_last_date = current_end

        except Exception as e:
            print(f"Failed to pull {timespan} data for {current_start} thru {current_end} for {ticker}: {e}")
            raise  # Re-raise the original exception

        # Break out of the loop if the data is up-to-date (or close to being
        # up-to-date, because it is possible that the entire range was not
        # pulled due to the way the API handles hour data from minute data)
        if current_end > datetime.now():
            break
        else:
            # Edge case: the last date in new_data is exactly time_overlap's
            # duration past current_start
            if new_date_last_date_check == new_data_last_date:
                current_start = current_end - timedelta(days=time_overlap)
                new_date_last_date_check = new_data_last_date
            else:
                current_start = new_data_last_date - timedelta(days=time_overlap)
                new_date_last_date_check = new_data_last_date

        # Check for free tier, and if so, pause for 12 seconds to avoid
        # hitting API rate limits
        if free_tier:
            if verbose:
                print("Sleeping for 12 seconds to avoid hitting API rate limits...\n")
            time.sleep(12)

    # Return the DataFrame containing the full history
    return full_history_df

if __name__ == "__main__":
    current_year = datetime.now().year
    current_month = datetime.now().month
    current_day = datetime.now().day

    # Open client connection
    client = RESTClient(api_key=api_keys["POLYGON_KEY"])

    # Create an empty DataFrame
    df = pd.DataFrame({
        'Date': pd.Series(dtype="datetime64[ns]"),
        'open': pd.Series(dtype="float64"),
        'high': pd.Series(dtype="float64"),
        'low': pd.Series(dtype="float64"),
        'close': pd.Series(dtype="float64"),
        'volume': pd.Series(dtype="float64"),
        'vwap': pd.Series(dtype="float64"),
        'transactions': pd.Series(dtype="int64"),
        'otc': pd.Series(dtype="object"),
    })

    # Example usage - daily
    df = polygon_fetch_full_history(
        client=client,
        ticker="SPY",
        timespan="day",
        multiplier=1,
        adjusted=True,
        existing_history_df=df,
        current_start=datetime(current_year - 2, current_month, current_day),
        free_tier=True,
        verbose=True,
    )
```

polygon_pull_data

```python
import os
import time
from datetime import datetime, timedelta

import pandas as pd
from IPython.display import display
from load_api_keys import load_api_keys
from polygon import RESTClient
from polygon_fetch_full_history import polygon_fetch_full_history
from settings import config

# Load API keys from the environment
api_keys = load_api_keys()

# Get the environment variable for where data is stored
DATA_DIR = config("DATA_DIR")

def polygon_pull_data(
    base_directory,
    ticker: str,
    source: str,
    asset_class: str,
    start_date: datetime,
    timespan: str,
    multiplier: int,
    adjusted: bool,
    force_existing_check: bool,
    free_tier: bool,
    verbose: bool,
    excel_export: bool,
    pickle_export: bool,
    output_confirmation: bool,
) -> pd.DataFrame:
    """
    Read the existing data file, download price data from Polygon, and export the data.

    Parameters:
    -----------
    base_directory
        Root path to store downloaded data.
    ticker : str
        Ticker symbol to download.
    source : str
        Name of the data source (e.g., 'Polygon').
    asset_class : str
        Asset class name (e.g., 'Equities').
    start_date : datetime
        Start date for the data.
    timespan : str
        Time span for the data (e.g., "minute", "hour", "day", "week",
        "month", "quarter", "year").
    multiplier : int
        Multiplier for the time span (e.g., 1 for daily data).
    adjusted : bool
        If True, return adjusted data; if False, return raw data.
    force_existing_check : bool
        If True, force a complete check of the existing data file to verify
        that there are no gaps in the data.
    free_tier : bool
        If True, pause between requests to avoid API rate limits.
    verbose : bool
        If True, print detailed information about the data being processed.
    excel_export : bool
        If True, export data to Excel format.
    pickle_export : bool
        If True, export data to Pickle format.
    output_confirmation : bool
        If True, print confirmation message.

    Returns:
    --------
    full_history_df : pd.DataFrame
        DataFrame containing the full history.
    """

    # Open client connection
    client = RESTClient(api_key=api_keys["POLYGON_KEY"])

    # Set file location based on parameters
    file_location = f"{base_directory}/{source}/{asset_class}/{timespan}/{ticker}.pkl"

    if timespan == "minute":
        time_delta = 15
        time_overlap = 1
    elif timespan == "hour":
        time_delta = 15
        time_overlap = 1
    elif timespan == "day":
        time_delta = 180
        time_overlap = 1
    else:
        raise Exception(f"Invalid timespan: {timespan}.")

    try:
        # Attempt to read existing pickle data file
        existing_history_df = pd.read_pickle(file_location)

        # Reset index if 'Date' is the index
        if 'Date' not in existing_history_df.columns:
            existing_history_df = existing_history_df.reset_index()

        print(f"File found...updating the {ticker} {timespan} data.")

        if verbose:
            print("Existing data:")
            print(existing_history_df)

        # Find last date in existing data
        last_data_date = existing_history_df['Date'].max()
        print(f"Last date in existing data: {last_data_date}")

        starting_rows = len(existing_history_df)
        print(f"Number of rows in existing data: {starting_rows}")

        # Overlap with existing data to capture all data
        current_start = last_data_date - timedelta(days=time_overlap)

    except FileNotFoundError:
        # Print error
        print(f"File not found...downloading the {ticker} {timespan} data.")

        # Create an empty DataFrame
        existing_history_df = pd.DataFrame({
            'Date': pd.Series(dtype="datetime64[ns]"),
            'open': pd.Series(dtype="float64"),
            'high': pd.Series(dtype="float64"),
            'low': pd.Series(dtype="float64"),
            'close': pd.Series(dtype="float64"),
            'volume': pd.Series(dtype="float64"),
            'vwap': pd.Series(dtype="float64"),
            'transactions': pd.Series(dtype="int64"),
            'otc': pd.Series(dtype="object"),
        })

        # No existing rows
        starting_rows = 0

        # Set current date to start date
        current_start = start_date

    # Check for force_existing_check flag
    if force_existing_check:
        print("Forcing check of existing data...")
        current_start = start_date

    full_history_df = polygon_fetch_full_history(
        client=client,
        ticker=ticker,
        timespan=timespan,
        multiplier=multiplier,
        adjusted=adjusted,
        existing_history_df=existing_history_df,
        current_start=current_start,
        free_tier=free_tier,
        verbose=verbose,
    )

    # Create directory
    directory = f"{base_directory}/{source}/{asset_class}/{timespan}"
    os.makedirs(directory, exist_ok=True)

    # Export to Excel
    if excel_export:
        print(f"Exporting {ticker} {timespan} data to Excel...")
        full_history_df.to_excel(f"{directory}/{ticker}.xlsx", sheet_name="data")

    # Export to Pickle
    if pickle_export:
        print(f"Exporting {ticker} {timespan} data to Pickle...")
        full_history_df.to_pickle(f"{directory}/{ticker}.pkl")

    total_rows = len(full_history_df)

    # Output confirmation
    if output_confirmation:
        print(f"The first and last date of {timespan} data for {ticker} is:")
        display(full_history_df[:1])
        display(full_history_df[-1:])
        print(f"Number of rows after data update: {total_rows}")
        if starting_rows:
            print(f"Number of rows added during update: {total_rows - starting_rows}")
        print(f"Polygon data complete for {ticker} {timespan} data.")
        print("--------------------")

    return full_history_df

if __name__ == "__main__":
    current_year = datetime.now().year
    current_month = datetime.now().month
    current_day = datetime.now().day

    # Stock data
    equities = ["AMZN", "AAPL"]

    # Example usage - pull minute, hourly, and daily data for each stock
    for stock in equities:
        for ts in ["minute", "hour", "day"]:
            polygon_pull_data(
                base_directory=DATA_DIR,
                ticker=stock,
                source="Polygon",
                asset_class="Equities",
                start_date=datetime(current_year - 2, current_month, current_day),
                timespan=ts,
                multiplier=1,
                adjusted=True,
                force_existing_check=False,
                free_tier=True,
                verbose=False,
                excel_export=True,
                pickle_export=True,
                output_confirmation=True,
            )
            time.sleep(12)
```

strategy_harry_brown_perm_port

```python
import pandas as pd

def strategy_harry_brown_perm_port(
    fund_list: list[str],
    starting_cash: int,
    cash_contrib: int,
    close_prices_df: pd.DataFrame,
    rebal_month: int,
    rebal_day: int,
    rebal_per_high: float,
    rebal_per_low: float,
    excel_export: bool,
    pickle_export: bool,
    output_confirmation: bool,
) -> pd.DataFrame:
    """
    Execute the rebalance strategy based on the specified criteria.

    Parameters:
    -----------
    fund_list : list[str]
        List of funds for data to be combined from. Funds are strings in the
        form "BTC-USD".
    starting_cash : int
        Starting investment balance.
    cash_contrib : int
        Cash contribution to be made daily.
    close_prices_df : pd.DataFrame
        DataFrame containing date and close prices for all funds to be included.
    rebal_month : int
        Month for annual rebalance.
    rebal_day : int
        Day for annual rebalance.
    rebal_per_high : float
        High percentage threshold for rebalance.
    rebal_per_low : float
        Low percentage threshold for rebalance.
    excel_export : bool
        If True, export data to Excel format.
    pickle_export : bool
        If True, export data to Pickle format.
    output_confirmation : bool
        If True, print confirmation message.

    Returns:
    --------
    df : pd.DataFrame
        DataFrame containing strategy data for all funds to be included. The
        DataFrame is also exported for later reference.
    """

    num_funds = len(fund_list)

    df = close_prices_df.copy()
    df.reset_index(inplace=True)

    # Date to be used for annual rebalance
    target_month = rebal_month
    target_day = rebal_day

    # Create a dataframe with dates from the specific month
    rebal_date = df[df['Date'].dt.month == target_month]

    # Specify the date or the next closest
    rebal_date = rebal_date[rebal_date['Date'].dt.day >= target_day]

    # Group by year and take the first entry for each year
    rebal_dates_by_year = rebal_date.groupby(rebal_date['Date'].dt.year).first().reset_index(drop=True)

    '''
    Column order for the dataframe:
    df[fund + "_BA_Shares"]
    df[fund + "_BA_$_Invested"]
    df[fund + "_BA_Port_%"]
    df['Total_BA_$_Invested']
    df['Contribution']
    df['Rebalance']
    df[fund + "_AA_Shares"]
    df[fund + "_AA_$_Invested"]
    df[fund + "_AA_Port_%"]
    df['Total_AA_$_Invested']
    '''

    # Calculate the columns and initial values for before action (BA) shares, $ invested, and port %
    for fund in fund_list:
        df[fund + "_BA_Shares"] = starting_cash / num_funds / df[fund + "_Close"]
        df[fund + "_BA_$_Invested"] = df[fund + "_BA_Shares"] * df[fund + "_Close"]
        df[fund + "_BA_Port_%"] = 0.25

    # Set column values initially
    df['Total_BA_$_Invested'] = starting_cash
    df['Contribution'] = cash_contrib
    df['Rebalance'] = "No"

    # Set columns and values initially for after action (AA) shares, $ invested, and port %
    for fund in fund_list:
        df[fund + "_AA_Shares"] = starting_cash / num_funds / df[fund + "_Close"]
        df[fund + "_AA_$_Invested"] = df[fund + "_AA_Shares"] * df[fund + "_Close"]
        df[fund + "_AA_Port_%"] = 0.25

    # Set column value for after action (AA) total $ invested
    df['Total_AA_$_Invested'] = starting_cash

    # Iterate through the dataframe and execute the strategy
    for index, row in df.iterrows():

        # Ensure there's a previous row to reference by checking the index value
        if index > 0:

            # Initialize variable
            Total_BA_Invested = 0

            # Calculate before action (BA) shares and $ invested values
            for fund in fund_list:
                df.at[index, fund + "_BA_Shares"] = df.at[index - 1, fund + "_AA_Shares"]
                df.at[index, fund + "_BA_$_Invested"] = df.at[index, fund + "_BA_Shares"] * row[fund + "_Close"]
                # Sum the asset values to find the total
                Total_BA_Invested = Total_BA_Invested + df.at[index, fund + "_BA_$_Invested"]

            # Calculate before action (BA) port % values
            for fund in fund_list:
                df.at[index, fund + "_BA_Port_%"] = df.at[index, fund + "_BA_$_Invested"] / Total_BA_Invested

            # Set column for before action (BA) total $ invested
            df.at[index, 'Total_BA_$_Invested'] = Total_BA_Invested

            # Initialize variables
            rebalance = "No"
            date = row['Date']

            # Check whether this date is the annual rebalance date
            if date in rebal_dates_by_year['Date'].values:
                rebalance = "Yes"

            # Check whether any asset has a portfolio percentage above the high
            # threshold or below the low threshold, and if so, flag for rebalance
            for fund in fund_list:
                if df.at[index, fund + "_BA_Port_%"] > rebal_per_high or df.at[index, fund + "_BA_Port_%"] < rebal_per_low:
                    rebalance = "Yes"

            # If a rebalance is required, rebalance back to 25% for each asset;
            # otherwise, divide the contribution evenly across assets
            if rebalance == "Yes":
                df.at[index, 'Rebalance'] = rebalance
                for fund in fund_list:
                    df.at[index, fund + "_AA_$_Invested"] = (Total_BA_Invested + df.at[index, 'Contribution']) * 0.25
            else:
                df.at[index, 'Rebalance'] = rebalance
                for fund in fund_list:
                    df.at[index, fund + "_AA_$_Invested"] = df.at[index, fund + "_BA_$_Invested"] + df.at[index, 'Contribution'] * 0.25

            # Initialize variable
            Total_AA_Invested = 0

            # Set column values for after action (AA) shares and port %
            for fund in fund_list:
                df.at[index, fund + "_AA_Shares"] = df.at[index, fund + "_AA_$_Invested"] / row[fund + "_Close"]
                # Sum the asset values to find the total
                Total_AA_Invested = Total_AA_Invested + df.at[index, fund + "_AA_$_Invested"]

            # Calculate after action (AA) port % values
            for fund in fund_list:
                df.at[index, fund + "_AA_Port_%"] = df.at[index, fund + "_AA_$_Invested"] / Total_AA_Invested

            # Set column for after action (AA) total $ invested
            df.at[index, 'Total_AA_$_Invested'] = Total_AA_Invested

    df['Return'] = df['Total_AA_$_Invested'].pct_change()
    df['Cumulative_Return'] = (1 + df['Return']).cumprod()

    plan_name = '_'.join(fund_list)

    # Export to Excel
    if excel_export:
        df.to_excel(f"{plan_name}_Strategy.xlsx", sheet_name="data")

    # Export to Pickle
    if pickle_export:
        df.to_pickle(f"{plan_name}_Strategy.pkl")

    # Output confirmation
    if output_confirmation:
        print(f"Strategy complete for {plan_name}")

    return df
```

summary_stats

```python
import numpy as np
import pandas as pd

def summary_stats(
    fund_list: list[str],
    df: pd.DataFrame,
    period: str,
    use_calendar_days: bool,
    excel_export: bool,
    pickle_export: bool,
    output_confirmation: bool,
) -> pd.DataFrame:
    """
    Calculate summary statistics for the given fund list and return data.

    Parameters:
    -----------
    fund_list : list[str]
        List of funds. Used in the Excel/Pickle export file names but not in
        the analysis. Funds are strings in the form "BTC-USD".
    df : pd.DataFrame
        DataFrame with return data. Assumes returns are in decimal format
        (e.g., 0.05 for 5%) and that there is only one column.
    period : str
        Period for which to calculate statistics. Options are "Monthly",
        "Weekly", "Daily".
    use_calendar_days : bool
        If True, use calendar days for calculations. If False, use trading days.
    excel_export : bool
        If True, export data to Excel format.
    pickle_export : bool
        If True, export data to Pickle format.
    output_confirmation : bool
        If True, print confirmation message.

    Returns:
    --------
    df_stats : pd.DataFrame
        DataFrame containing various portfolio statistics.
    """

    # Get the period in proper format
    period = period.strip().capitalize()

    # Map base timeframes
    period_to_timeframe = {
        "Monthly": 12,
        "Weekly": 52,
        "Daily": 365 if use_calendar_days else 252,
    }

    try:
        timeframe = period_to_timeframe[period]
    except KeyError:
        raise ValueError(f"Invalid period: {period}. Must be one of {list(period_to_timeframe.keys())}")

    df_stats = pd.DataFrame(df.mean(axis=0) * timeframe)  # Annualized
    df_stats.columns = ['Annualized Mean']
    df_stats['Annualized Volatility'] = df.std() * np.sqrt(timeframe)  # Annualized
    df_stats['Annualized Sharpe Ratio'] = df_stats['Annualized Mean'] / df_stats['Annualized Volatility']

    df_cagr = (1 + df[df.columns[0]]).cumprod()
    cagr = (df_cagr.iloc[-1] / 1) ** (1 / (len(df_cagr) / timeframe)) - 1
    df_stats['CAGR'] = cagr

    df_stats[f'{period} Max Return'] = df.max()
    df_stats[f'{period} Max Return (Date)'] = df.idxmax().values[0]
    df_stats[f'{period} Min Return'] = df.min()
    df_stats[f'{period} Min Return (Date)'] = df.idxmin().values[0]

    wealth_index = 1000 * (1 + df).cumprod()
    previous_peaks = wealth_index.cummax()
    drawdowns = (wealth_index - previous_peaks) / previous_peaks
    df_stats['Max Drawdown'] = drawdowns.min()
    df_stats['Peak'] = [previous_peaks[col][:drawdowns[col].idxmin()].idxmax() for col in previous_peaks.columns]
    df_stats['Trough'] = drawdowns.idxmin()

    recovery_date = []
    for col in wealth_index.columns:
        prev_max = previous_peaks[col][:drawdowns[col].idxmin()].max()
        recovery_wealth = pd.DataFrame([wealth_index[col][drawdowns[col].idxmin():]]).T
        recovery_date.append(recovery_wealth[recovery_wealth[col] >= prev_max].index.min())
    df_stats['Recovery Date'] = recovery_date

    df_stats['Days to Recover'] = (df_stats['Recovery Date'] - df_stats['Trough']).dt.days
    df_stats['MAR Ratio'] = df_stats['CAGR'] / -df_stats['Max Drawdown']

    plan_name = '_'.join(fund_list)

    # Export to Excel
    if excel_export:
        df_stats.to_excel(f"{plan_name}_Summary_Stats.xlsx", sheet_name="data")

    # Export to Pickle
    if pickle_export:
        df_stats.to_pickle(f"{plan_name}_Summary_Stats.pkl")

    # Output confirmation
    if output_confirmation:
        print(f"Summary stats complete for {plan_name}")

    return df_stats
```

yf_pull_data

```python
import os

import pandas as pd
import yfinance as yf
from IPython.display import display

def yf_pull_data(
    base_directory,
    ticker: str,
    source: str,
    asset_class: str,
    excel_export: bool,
    pickle_export: bool,
    output_confirmation: bool,
) -> pd.DataFrame:
    """
    Download daily price data from Yahoo Finance and export it.

    Parameters:
    -----------
    base_directory
        Root path to store downloaded data.
    ticker : str
        Ticker symbol to download.
    source : str
        Name of the data source (e.g., 'Yahoo').
    asset_class : str
        Asset class name (e.g., 'Equities').
    excel_export : bool
        If True, export data to Excel format.
    pickle_export : bool
        If True, export data to Pickle format.
    output_confirmation : bool
        If True, print confirmation message.

    Returns:
    --------
    df : pd.DataFrame
        DataFrame containing the downloaded data.
    """
```
\u0026#34;\u0026#34;\u0026#34; # Download data from YF df = yf.download(ticker, start=\u0026#34;1900-01-01\u0026#34;) # Drop the column level with the ticker symbol df.columns = df.columns.droplevel(1) # Reset index df = df.reset_index() # Remove the \u0026#34;Price\u0026#34; header from the index df.columns.name = None # Reset date column df[\u0026#39;Date\u0026#39;] = df[\u0026#39;Date\u0026#39;].dt.tz_localize(None) # Set \u0026#39;Date\u0026#39; column as index df = df.set_index(\u0026#39;Date\u0026#39;, drop=True) # Drop data from last day because it\u0026#39;s not accurate until end of day df = df.drop(df.index[-1]) # Create directory directory = f\u0026#34;{base_directory}/{source}/{asset_class}/Daily\u0026#34; os.makedirs(directory, exist_ok=True) # Export to excel if excel_export == True: df.to_excel(f\u0026#34;{directory}/{ticker}.xlsx\u0026#34;, sheet_name=\u0026#34;data\u0026#34;) else: pass # Export to pickle if pickle_export == True: df.to_pickle(f\u0026#34;{directory}/{ticker}.pkl\u0026#34;) else: pass # Output confirmation if output_confirmation == True: print(f\u0026#34;The first and last dates of data for {ticker} are: \u0026#34;) display(df[:1]) display(df[-1:]) print(f\u0026#34;Yahoo Finance data complete for {ticker}\u0026#34;) print(f\u0026#34;--------------------\u0026#34;) else: pass return df References None\nCode The jupyter notebook with the functions and all other code is available here. The html export of the jupyter notebook is available here.
The pdf export of the jupyter notebook is available here.\n","date":"2025-02-02T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2025/02/02/reusable-extensible-python-functions-financial-data-analysis/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2025/02/02/reusable-extensible-python-functions-financial-data-analysis/","title":"Reusable And Extensible Python Functions For Financial Data Analysis"},{"content":"Python Module Management As an Arch Linux user, the push is to utilize pacman and related tools to manage dependencies and package updates (including Python modules). In fact, the wiki itself explicitly states this (see 2.1), and the default Arch installation of Python disables python-pip.\nUnfortunately, limited resources are put into maintaining packages for Python modules, so only the most common and popular modules are packaged, though those are updated promptly, as is typical within the Arch ecosystem.\nCreating A Virtual Environment After recently delving into crypto and the web3 Python module, the Coinbase API, and others, I\u0026rsquo;ve found the need to install Python modules from PyPI, the Python Package Index. This is the most exhaustive location to find modules, including the latest updates and version history.\nUsing python-pip necessitated the use of virtual environments, which led me to reconsider maintaining Python modules through pacman at all (or to maintain very few that way).\nI chose to place the virtual environments at ~/python-virtual-envs/ and within that directory have one called general and another called wrds. The wrds environment is specific to Wharton Research Data Services, which requires (for some reason) an older version of numpy.\nThe \u0026ldquo;general\u0026rdquo; environment covers everything else.
I created it with the usual command:\n$ python -m venv ~/python-virtual-envs/general Once created, it can be activated (either in a terminal or an IDE such as VS Code) by executing the following in the terminal:\n$ source ~/python-virtual-envs/general/bin/activate Creating Version-Specific Python Virtual Environments If a specific version of Python is required (vs the version installed on the base Arch system), it can be installed as follows:\n$ yay python312 And then follow the requisite prompts to install. Note that I am using yay, with the binary build yay-bin; yay invokes sudo itself, so it should not be run as root.\nOnce that completes, the virtual environment can be created as follows:\n$ python3.12 -m venv ~/python-virtual-envs/general_312 The virtual environment can then be activated in the same manner as any other:\n$ source ~/python-virtual-envs/general_312/bin/activate Using python-pip After the virtual environment is created and activated, modules can be installed using python-pip, such as:\n$ pip install \u0026lt;module-name\u0026gt; If you want to view all installed modules, run:\n$ pip list Or the outdated modules:\n$ pip list --outdated Modules can then be updated at a later point in time with:\n$ pip install --upgrade \u0026lt;module-name\u0026gt; If you are interested in the specifics of a module (name, version, location, etc.):\n$ pip show \u0026lt;module-name\u0026gt; Using A requirements.txt File If a requirements.txt file is present in a git repository/directory, you can install the required dependencies with the following command:\n$ pip install -r requirements.txt pip will then install all the required package and module versions based on the requirements file.\nMaintaining Across Multiple Systems To avoid having to redundantly install modules on different systems, after I make a change to the virtual environment I can zip the entire ~/python-virtual-envs/ directory (or any of the individual directories of the virtual environments) and upload the zip file to Dropbox.
This takes only a few minutes, and if I am working on a different system I can simply extract the archive and have a completely up-to-date virtual environment to work in.\nReferences https://docs.python.org/3/library/venv.html https://pypi.org/ https://note.nkmk.me/en/python-pip-usage/ https://wiki.archlinux.org/title/Python https://github.com/Jguer/yay ","date":"2024-12-02T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2024/12/02/using-python-virtual-environments/using-python-virtual-environments_3_final.jpg","permalink":"https://www.jaredszajkowski.com/stack/2024/12/02/using-python-virtual-environments/","title":"Using Python Virtual Environments"},{"content":" Introduction Harry Browne was an influential politician, financial advisor, and author who lived from 1933 to 2006 and published 12 books. Wikipedia has an in-depth biography on him.\nWithin the world of finance and investing, one of his best-known works is Fail-Safe Investing: Lifelong Financial Security in 30 Minutes. In it, he introduces the idea of the \u0026ldquo;Permanent Portfolio\u0026rdquo;, an investment strategy that uses only four assets and is very simple to implement.\nIn this post, we will investigate Browne\u0026rsquo;s suggested portfolio, including performance across various market cycles and economic regimes.\nBrowne\u0026rsquo;s Portfolio Requirements In Fail-Safe Investing, under rule #11, Browne lays out the requirements for a \u0026ldquo;bulletproof portfolio\u0026rdquo; that will \u0026ldquo;assure that your wealth will survive any event - including events that would be devastating to any one investment.
In other words, this portfolio should protect you no matter what the future brings.\u0026rdquo;\nHis requirements for the portfolio consist of the following:\nSafety: Protection against any economic future, including \u0026ldquo;inflation, recession, or even depression\u0026rdquo; Stability: Performance should be consistent so that you will not need to make any changes and will not experience significant drawdowns Simplicity: Easy to implement and takes very little time to maintain He then describes the four \u0026ldquo;broad movements\u0026rdquo; of the economy:\nProsperity: The economy is growing, business is doing well, interest rates are usually low Inflation: The cost of goods and services is rising Tight money or recession: The money supply is shrinking, economic activity is slowing Deflation: Prices are declining and the value of money is increasing The Permanent Portfolio Browne then matches an asset class to each of the economic conditions above:\nProsperity -\u0026gt; Stocks (due to prosperity) and long-term bonds (when interest rates fall) Inflation -\u0026gt; Gold Deflation -\u0026gt; Long-term bonds (when interest rates fall) Tight money -\u0026gt; Cash He completes the Permanent Portfolio by stipulating the following:\nStart with a base allocation of 25% to each of the asset classes (stocks, bonds, gold, cash) Rebalance back to the base allocation annually, or when \u0026ldquo;any of the four investments has become worth less than 15%, or more than 35%, of the portfolio\u0026rsquo;s overall value\u0026rdquo; Note: Browne does not specify when the portfolio should be rebalanced; therefore, we will make an assumption of a January 1st rebalance.
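Browne's 15%/35% tolerance bands can be expressed as a simple check. Here is a minimal sketch of that rule (the function name and example weights are hypothetical, not taken from the strategy code in this post):

```python
def needs_rebalance(weights, lower=0.15, upper=0.35):
    """Return True if any asset's portfolio weight has drifted outside the bands."""
    return any(w < lower or w > upper for w in weights)

# Example: gold has drifted to 40% of the portfolio, breaching the upper band
print(needs_rebalance([0.20, 0.20, 0.40, 0.20]))  # True
print(needs_rebalance([0.25, 0.25, 0.25, 0.25]))  # False
```

Under Browne's rules, a band breach (or the assumed January 1st date) triggers a rebalance back to 25% in each asset.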
Data For this exercise, we will use the following asset classes:\nStocks: S\u0026amp;P 500 (SPXT_S\u0026amp;P 500 Total Return Index) Bonds: 10 Year US Treasuries (SPBDU10T_S\u0026amp;P US Treasury Bond 7-10 Year Total Return Index) Gold: Gold Spot Price (XAU_Gold USD Spot) Cash: USD With the exception of cash, all data is sourced from Bloomberg.\nWe could use ETFs, but the available price history for the ETFs is much shorter than the indices above. If we wanted to use ETFs, the following would work:\nStocks: IVV - iShares Core S\u0026amp;P 500 ETF Bonds: IEF - iShares 7-10 Year Treasury Bond ETF Gold: IAU - iShares Gold Trust Cash: USD Python Functions Here are the functions needed for this project:\nbb_clean_data: Takes an Excel export from Bloomberg, removes the miscellaneous headings/rows, and returns a DataFrame. df_info: A simple function to display the information about a DataFrame and the first five rows and last five rows. df_info_markdown: Similar to the df_info function above, except that it converts the output to markdown. export_track_md_deps: Exports various text outputs to markdown files, which are included in the index.md file created when building the site with Hugo. load_data: Load data from a CSV, Excel, or Pickle file into a pandas DataFrame. pandas_set_decimal_places: Set the number of decimal places displayed for floating-point numbers in pandas. strategy_harry_brown_perm_port: Execute the strategy for the Harry Browne permanent portfolio. summary_stats: Generate summary statistics for a series of returns. Data Overview Load Data As previously mentioned, the data for this exercise comes primarily from Bloomberg.
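Each of the loaders below ends with the same two steps: daily returns via pct_change and a compounded total-return series via cumprod. A minimal sketch of just that step, using made-up prices rather than the Bloomberg data:

```python
import pandas as pd

prices = pd.Series([100.0, 101.0, 99.0, 102.0], name='Close')
daily_return = prices.pct_change()           # first value is NaN
total_return = (1 + daily_return).cumprod()  # compounded growth factor

# The compounding telescopes, so the last value equals last price / first price
print(total_return.iloc[-1])  # approximately 1.02
```

This is why the Bonds_Total_Return, Stocks_Total_Return, and Gold_Total_Return columns start at roughly 1.0 and track each series' growth relative to its first close.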
We\u0026rsquo;ll start with loading the data first for bonds:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 # Set decimal places pandas_set_decimal_places(3) # Bonds dataframe bb_clean_data( base_directory=DATA_DIR, fund_ticker_name=\u0026#34;SPBDU10T_S\u0026amp;P US Treasury Bond 7-10 Year Total Return Index\u0026#34;, source=\u0026#34;Bloomberg\u0026#34;, asset_class=\u0026#34;Indices\u0026#34;, excel_export=True, pickle_export=True, output_confirmation=True, ) bonds_data = load_data( base_directory=DATA_DIR, ticker=\u0026#34;SPBDU10T_S\u0026amp;P US Treasury Bond 7-10 Year Total Return Index_Clean\u0026#34;, source=\u0026#34;Bloomberg\u0026#34;, asset_class=\u0026#34;Indices\u0026#34;, timeframe=\u0026#34;Daily\u0026#34;, ) bonds_data[\u0026#39;Date\u0026#39;] = pd.to_datetime(bonds_data[\u0026#39;Date\u0026#39;]) bonds_data.set_index(\u0026#39;Date\u0026#39;, inplace = True) bonds_data = bonds_data[(bonds_data.index \u0026gt;= \u0026#39;1990-01-01\u0026#39;) \u0026amp; (bonds_data.index \u0026lt;= \u0026#39;2023-12-31\u0026#39;)] bonds_data.rename(columns={\u0026#39;Close\u0026#39;:\u0026#39;Bonds_Close\u0026#39;}, inplace=True) bonds_data[\u0026#39;Bonds_Daily_Return\u0026#39;] = bonds_data[\u0026#39;Bonds_Close\u0026#39;].pct_change() bonds_data[\u0026#39;Bonds_Total_Return\u0026#39;] = (1 + bonds_data[\u0026#39;Bonds_Daily_Return\u0026#39;]).cumprod() display(bonds_data.head()) The following is the output:\nDate Bonds_Close Bonds_Daily_Return Bonds_Total_Return 1990-01-02 00:00:00 99.972 nan nan 1990-01-03 00:00:00 99.733 -0.002 0.998 1990-01-04 00:00:00 99.813 0.001 0.998 1990-01-05 00:00:00 99.769 -0.000 0.998 1990-01-08 00:00:00 99.681 -0.001 0.997 Then for stocks:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 # Stocks dataframe bb_clean_data( base_directory=DATA_DIR, fund_ticker_name=\u0026#34;SPXT_S\u0026amp;P 500 Total Return Index\u0026#34;, source=\u0026#34;Bloomberg\u0026#34;, 
asset_class=\u0026#34;Indices\u0026#34;, excel_export=True, pickle_export=True, output_confirmation=True, ) stocks_data = load_data( base_directory=DATA_DIR, ticker=\u0026#34;SPXT_S\u0026amp;P 500 Total Return Index_Clean\u0026#34;, source=\u0026#34;Bloomberg\u0026#34;, asset_class=\u0026#34;Indices\u0026#34;, timeframe=\u0026#34;Daily\u0026#34;, ) stocks_data[\u0026#39;Date\u0026#39;] = pd.to_datetime(stocks_data[\u0026#39;Date\u0026#39;]) stocks_data.set_index(\u0026#39;Date\u0026#39;, inplace = True) stocks_data = stocks_data[(stocks_data.index \u0026gt;= \u0026#39;1990-01-01\u0026#39;) \u0026amp; (stocks_data.index \u0026lt;= \u0026#39;2023-12-31\u0026#39;)] stocks_data.rename(columns={\u0026#39;Close\u0026#39;:\u0026#39;Stocks_Close\u0026#39;}, inplace=True) stocks_data[\u0026#39;Stocks_Daily_Return\u0026#39;] = stocks_data[\u0026#39;Stocks_Close\u0026#39;].pct_change() stocks_data[\u0026#39;Stocks_Total_Return\u0026#39;] = (1 + stocks_data[\u0026#39;Stocks_Daily_Return\u0026#39;]).cumprod() display(stocks_data.head()) The following is the output:\nDate Stocks_Close Stocks_Daily_Return Stocks_Total_Return 1990-01-01 00:00:00 nan nan nan 1990-01-02 00:00:00 386.160 nan nan 1990-01-03 00:00:00 385.170 -0.003 0.997 1990-01-04 00:00:00 382.020 -0.008 0.989 1990-01-05 00:00:00 378.300 -0.010 0.980 And finally, gold:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 # Gold dataframe bb_clean_data( base_directory=DATA_DIR, fund_ticker_name=\u0026#34;XAU_Gold USD Spot\u0026#34;, source=\u0026#34;Bloomberg\u0026#34;, asset_class=\u0026#34;Commodities\u0026#34;, excel_export=True, pickle_export=True, output_confirmation=True, ) gold_data = load_data( base_directory=DATA_DIR, ticker=\u0026#34;XAU_Gold USD Spot_Clean\u0026#34;, source=\u0026#34;Bloomberg\u0026#34;, asset_class=\u0026#34;Commodities\u0026#34;, timeframe=\u0026#34;Daily\u0026#34;, ) gold_data[\u0026#39;Date\u0026#39;] = pd.to_datetime(gold_data[\u0026#39;Date\u0026#39;]) 
gold_data.set_index(\u0026#39;Date\u0026#39;, inplace = True) gold_data = gold_data[(gold_data.index \u0026gt;= \u0026#39;1990-01-01\u0026#39;) \u0026amp; (gold_data.index \u0026lt;= \u0026#39;2023-12-31\u0026#39;)] gold_data.rename(columns={\u0026#39;Close\u0026#39;:\u0026#39;Gold_Close\u0026#39;}, inplace=True) gold_data[\u0026#39;Gold_Daily_Return\u0026#39;] = gold_data[\u0026#39;Gold_Close\u0026#39;].pct_change() gold_data[\u0026#39;Gold_Total_Return\u0026#39;] = (1 + gold_data[\u0026#39;Gold_Daily_Return\u0026#39;]).cumprod() display(gold_data.head()) The following is the output:\nDate Gold_Close Gold_Daily_Return Gold_Total_Return 1990-01-02 00:00:00 399.000 nan nan 1990-01-03 00:00:00 395.000 -0.010 0.990 1990-01-04 00:00:00 396.500 0.004 0.994 1990-01-05 00:00:00 405.000 0.021 1.015 1990-01-08 00:00:00 404.600 -0.001 1.014 Combine Data We\u0026rsquo;ll now combine the dataframes for the timeseries data from each of the asset classes, as follows:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 # Merge the stock data and bond data into a single DataFrame using their indices (dates) perm_port = pd.merge(stocks_data[\u0026#39;Stocks_Close\u0026#39;], bonds_data[\u0026#39;Bonds_Close\u0026#39;], left_index=True, right_index=True) # Add gold data to the portfolio DataFrame by merging it with the existing data on indices (dates) perm_port = pd.merge(perm_port, gold_data[\u0026#39;Gold_Close\u0026#39;], left_index=True, right_index=True) # Add a column for cash with a constant value of 1 (assumes the value of cash remains constant at $1 over time) perm_port[\u0026#39;Cash_Close\u0026#39;] = 1 # Remove any rows with missing values (NaN) to ensure clean data for further analysis perm_port.dropna(inplace=True) # Display the finalized portfolio DataFrame display(perm_port) Check For Missing Values We can check for any missing (NaN) values in each column:\n1 2 # Check for any missing values in each column perm_port.isnull().any() DataFrame Info Now, running:\n1 df_info(perm_port) 
Gives us the following:\n1 2 3 4 5 6 7 8 9 10 11 12 13 The columns, shape, and data types are: \u0026lt;class \u0026#39;pandas.core.frame.DataFrame\u0026#39;\u0026gt; DatetimeIndex: 8479 entries, 1990-01-02 to 2023-12-29 Data columns (total 4 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Stocks_Close 8479 non-null float64 1 Bonds_Close 8479 non-null float64 2 Gold_Close 8479 non-null float64 3 Cash_Close 8479 non-null int64 dtypes: float64(3), int64(1) memory usage: 331.2 KB The first 5 rows are:\nDate Stocks_Close Bonds_Close Gold_Close Cash_Close 1990-01-02 00:00:00 386.16 99.97 399.00 1.00 1990-01-03 00:00:00 385.17 99.73 395.00 1.00 1990-01-04 00:00:00 382.02 99.81 396.50 1.00 1990-01-05 00:00:00 378.30 99.77 405.00 1.00 1990-01-08 00:00:00 380.04 99.68 404.60 1.00 The last 5 rows are:\nDate Stocks_Close Bonds_Close Gold_Close Cash_Close 2023-12-22 00:00:00 10292.37 604.17 2053.08 1.00 2023-12-26 00:00:00 10335.98 604.55 2067.81 1.00 2023-12-27 00:00:00 10351.60 609.36 2077.49 1.00 2023-12-28 00:00:00 10356.59 606.83 2065.61 1.00 2023-12-29 00:00:00 10327.83 606.18 2062.98 1.00 We can see that we have daily close price data for all 4 asset classes from the beginning of 1990 to the end of 2023.\nExecute Strategy Using an annual rebalance date of January 1, we\u0026rsquo;ll now execute the strategy with the following code:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 # List of funds to be used fund_list = [\u0026#39;Stocks\u0026#39;, \u0026#39;Bonds\u0026#39;, \u0026#39;Gold\u0026#39;, \u0026#39;Cash\u0026#39;] # Starting cash contribution starting_cash = 10000 # Monthly cash contribution cash_contrib = 0 strat = strategy_harry_brown_perm_port( fund_list=fund_list, starting_cash=starting_cash, cash_contrib=cash_contrib, close_prices_df=perm_port, rebal_month=1, rebal_day=1, rebal_per_high=0.35, rebal_per_low=0.15, excel_export=True, pickle_export=True, output_confirmation=True, ) strat = 
strat.set_index(\u0026#39;Date\u0026#39;) This returns a dataframe with the entire strategy.\nRunning:\n1 df_info(strat) Gives us:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 The columns, shape, and data types are: \u0026lt;class \u0026#39;pandas.core.frame.DataFrame\u0026#39;\u0026gt; DatetimeIndex: 8479 entries, 1990-01-02 to 2023-12-29 Data columns (total 34 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Stocks_Close 8479 non-null float64 1 Bonds_Close 8479 non-null float64 2 Gold_Close 8479 non-null float64 3 Cash_Close 8479 non-null int64 4 Stocks_BA_Shares 8479 non-null float64 5 Stocks_BA_$_Invested 8479 non-null float64 6 Stocks_BA_Port_% 8479 non-null float64 7 Bonds_BA_Shares 8479 non-null float64 8 Bonds_BA_$_Invested 8479 non-null float64 9 Bonds_BA_Port_% 8479 non-null float64 10 Gold_BA_Shares 8479 non-null float64 11 Gold_BA_$_Invested 8479 non-null float64 12 Gold_BA_Port_% 8479 non-null float64 13 Cash_BA_Shares 8479 non-null float64 14 Cash_BA_$_Invested 8479 non-null float64 15 Cash_BA_Port_% 8479 non-null float64 16 Total_BA_$_Invested 8479 non-null float64 17 Contribution 8479 non-null int64 18 Rebalance 8479 non-null object 19 Stocks_AA_Shares 8479 non-null float64 20 Stocks_AA_$_Invested 8479 non-null float64 21 Stocks_AA_Port_% 8479 non-null float64 22 Bonds_AA_Shares 8479 non-null float64 23 Bonds_AA_$_Invested 8479 non-null float64 24 Bonds_AA_Port_% 8479 non-null float64 25 Gold_AA_Shares 8479 non-null float64 26 Gold_AA_$_Invested 8479 non-null float64 27 Gold_AA_Port_% 8479 non-null float64 28 Cash_AA_Shares 8479 non-null float64 29 Cash_AA_$_Invested 8479 non-null float64 30 Cash_AA_Port_% 8479 non-null float64 31 Total_AA_$_Invested 8479 non-null float64 32 Return 8478 non-null float64 33 Cumulative_Return 8478 non-null float64 dtypes: float64(31), int64(2), object(1) memory usage: 2.3+ MB The first 5 rows are:\nDate Stocks_Close 
Bonds_Close Gold_Close Cash_Close Stocks_BA_Shares Stocks_BA_$_Invested Stocks_BA_Port_% Bonds_BA_Shares Bonds_BA_$_Invested Bonds_BA_Port_% Gold_BA_Shares Gold_BA_$_Invested Gold_BA_Port_% Cash_BA_Shares Cash_BA_$_Invested Cash_BA_Port_% Total_BA_$_Invested Contribution Rebalance Stocks_AA_Shares Stocks_AA_$_Invested Stocks_AA_Port_% Bonds_AA_Shares Bonds_AA_$_Invested Bonds_AA_Port_% Gold_AA_Shares Gold_AA_$_Invested Gold_AA_Port_% Cash_AA_Shares Cash_AA_$_Invested Cash_AA_Port_% Total_AA_$_Invested Return Cumulative_Return 1990-01-02 00:00:00 386.16 99.97 399.00 1 6.47 2500.00 0.25 25.01 2500.00 0.25 6.27 2500.00 0.25 2500.00 2500.00 0.25 10000.00 0 No 6.47 2500.00 0.25 25.01 2500.00 0.25 6.27 2500.00 0.25 2500.00 2500.00 0.25 10000.00 nan nan 1990-01-03 00:00:00 385.17 99.73 395.00 1 6.47 2493.59 0.25 25.01 2494.02 0.25 6.27 2474.94 0.25 2500.00 2500.00 0.25 9962.55 0 No 6.47 2493.59 0.25 25.01 2494.02 0.25 6.27 2474.94 0.25 2500.00 2500.00 0.25 9962.55 -0.00 1.00 1990-01-04 00:00:00 382.02 99.81 396.50 1 6.47 2473.20 0.25 25.01 2496.02 0.25 6.27 2484.34 0.25 2500.00 2500.00 0.25 9953.56 0 No 6.47 2473.20 0.25 25.01 2496.02 0.25 6.27 2484.34 0.25 2500.00 2500.00 0.25 9953.56 -0.00 1.00 1990-01-05 00:00:00 378.30 99.77 405.00 1 6.47 2449.11 0.25 25.01 2494.92 0.25 6.27 2537.59 0.25 2500.00 2500.00 0.25 9981.63 0 No 6.47 2449.11 0.25 25.01 2494.92 0.25 6.27 2537.59 0.25 2500.00 2500.00 0.25 9981.63 0.00 1.00 1990-01-08 00:00:00 380.04 99.68 404.60 1 6.47 2460.38 0.25 25.01 2492.72 0.25 6.27 2535.09 0.25 2500.00 2500.00 0.25 9988.19 0 No 6.47 2460.38 0.25 25.01 2492.72 0.25 6.27 2535.09 0.25 2500.00 2500.00 0.25 9988.19 0.00 1.00 The last 5 rows are:\nDate Stocks_Close Bonds_Close Gold_Close Cash_Close Stocks_BA_Shares Stocks_BA_$_Invested Stocks_BA_Port_% Bonds_BA_Shares Bonds_BA_$_Invested Bonds_BA_Port_% Gold_BA_Shares Gold_BA_$_Invested Gold_BA_Port_% Cash_BA_Shares Cash_BA_$_Invested Cash_BA_Port_% Total_BA_$_Invested Contribution Rebalance Stocks_AA_Shares 
Stocks_AA_$_Invested Stocks_AA_Port_% Bonds_AA_Shares Bonds_AA_$_Invested Bonds_AA_Port_% Gold_AA_Shares Gold_AA_$_Invested Gold_AA_Port_% Cash_AA_Shares Cash_AA_$_Invested Cash_AA_Port_% Total_AA_$_Invested Return Cumulative_Return 2023-12-22 00:00:00 10292.37 604.17 2053.08 1 1.81 18595.87 0.29 25.03 15124.46 0.23 8.00 16426.12 0.25 14717.17 14717.17 0.23 64863.62 0 No 1.81 18595.87 0.29 25.03 15124.46 0.23 8.00 16426.12 0.25 14717.17 14717.17 0.23 64863.62 0.00 6.49 2023-12-26 00:00:00 10335.98 604.55 2067.81 1 1.81 18674.66 0.29 25.03 15134.20 0.23 8.00 16543.97 0.25 14717.17 14717.17 0.23 65070.01 0 No 1.81 18674.66 0.29 25.03 15134.20 0.23 8.00 16543.97 0.25 14717.17 14717.17 0.23 65070.01 0.00 6.51 2023-12-27 00:00:00 10351.60 609.36 2077.49 1 1.81 18702.89 0.29 25.03 15254.36 0.23 8.00 16621.42 0.25 14717.17 14717.17 0.23 65295.84 0 No 1.81 18702.89 0.29 25.03 15254.36 0.23 8.00 16621.42 0.25 14717.17 14717.17 0.23 65295.84 0.00 6.53 2023-12-28 00:00:00 10356.59 606.83 2065.61 1 1.81 18711.90 0.29 25.03 15191.10 0.23 8.00 16526.37 0.25 14717.17 14717.17 0.23 65146.54 0 No 1.81 18711.90 0.29 25.03 15191.10 0.23 8.00 16526.37 0.25 14717.17 14717.17 0.23 65146.54 -0.00 6.51 2023-12-29 00:00:00 10327.83 606.18 2062.98 1 1.81 18659.94 0.29 25.03 15175.01 0.23 8.00 16505.33 0.25 14717.17 14717.17 0.23 65057.44 0 No 1.81 18659.94 0.29 25.03 15175.01 0.23 8.00 16505.33 0.25 14717.17 14717.17 0.23 65057.44 -0.00 6.51 From the above, we can see that there are all columns for before/after re-balancing, including the shares, asset values, percentages, etc. 
for the four different asset classes.\nStrategy Statistics Let\u0026rsquo;s look at the summary statistics for the entire timeframe, as well as several different ranges:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 sum_stats = summary_stats( fund_list=fund_list, df=strat[[\u0026#39;Return\u0026#39;]], period=\u0026#34;Daily\u0026#34;, use_calendar_days=False, excel_export=True, pickle_export=True, output_confirmation=True, ) strat_pre_1999 = strat[strat.index \u0026lt; \u0026#39;2000-01-01\u0026#39;] sum_stats_pre_1999 = summary_stats( fund_list=fund_list, df=strat_pre_1999[[\u0026#39;Return\u0026#39;]], period=\u0026#34;Daily\u0026#34;, use_calendar_days=False, excel_export=False, pickle_export=False, output_confirmation=True, ) strat_post_1999 = strat[strat.index \u0026gt;= \u0026#39;2000-01-01\u0026#39;] sum_stats_post_1999 = summary_stats( fund_list=fund_list, df=strat_post_1999[[\u0026#39;Return\u0026#39;]], period=\u0026#34;Daily\u0026#34;, use_calendar_days=False, excel_export=False, pickle_export=False, output_confirmation=True, ) strat_post_2009 = strat[strat.index \u0026gt;= \u0026#39;2010-01-01\u0026#39;] sum_stats_post_2009 = summary_stats( fund_list=fund_list, df=strat_post_2009[[\u0026#39;Return\u0026#39;]], period=\u0026#34;Daily\u0026#34;, use_calendar_days=False, excel_export=False, pickle_export=False, output_confirmation=True, ) And then concat them to make comparing them easier:\n1 2 3 4 5 6 7 8 9 all_sum_stats = pd.concat([sum_stats]) all_sum_stats = all_sum_stats.rename(index={\u0026#39;Return\u0026#39;: \u0026#39;1990 - 2023\u0026#39;}) all_sum_stats = pd.concat([all_sum_stats, sum_stats_pre_1999]) all_sum_stats = all_sum_stats.rename(index={\u0026#39;Return\u0026#39;: \u0026#39;Pre 1999\u0026#39;}) all_sum_stats = pd.concat([all_sum_stats, sum_stats_post_1999]) all_sum_stats = all_sum_stats.rename(index={\u0026#39;Return\u0026#39;: \u0026#39;Post 1999\u0026#39;}) all_sum_stats = pd.concat([all_sum_stats, sum_stats_post_2009]) all_sum_stats =
all_sum_stats.rename(index={\u0026#39;Return\u0026#39;: \u0026#39;Post 2009\u0026#39;}) display(all_sum_stats) Which gives us:\nAnnualized Mean Annualized Volatility Annualized Sharpe Ratio CAGR Daily Max Return Daily Max Return (Date) Daily Min Return Daily Min Return (Date) Max Drawdown Peak Bottom Recovery Date 1990 - 2023 0.057 0.060 0.957 0.057 0.029 2020-03-24 00:00:00 -0.030 2020-03-12 00:00:00 -0.154 2008-03-18 00:00:00 2008-11-12 00:00:00 2009-10-06 00:00:00 Pre 1999 0.060 0.050 1.207 0.061 0.022 1999-09-28 00:00:00 -0.018 1993-08-05 00:00:00 -0.062 1998-07-20 00:00:00 1998-08-31 00:00:00 1998-11-05 00:00:00 Post 1999 0.056 0.064 0.883 0.056 0.029 2020-03-24 00:00:00 -0.030 2020-03-12 00:00:00 -0.154 2008-03-18 00:00:00 2008-11-12 00:00:00 2009-10-06 00:00:00 Post 2009 0.056 0.060 0.927 0.056 0.029 2020-03-24 00:00:00 -0.030 2020-03-12 00:00:00 -0.127 2021-12-27 00:00:00 2022-10-20 00:00:00 2023-12-01 00:00:00 Annual Returns Here\u0026rsquo;s the annual returns:\nYear Return 1991 0.102 1992 0.030 1993 0.099 1994 -0.017 1995 0.153 1996 0.049 1997 0.056 1998 0.102 1999 0.039 2000 0.000 2001 -0.005 2002 0.043 2003 0.121 2004 0.051 2005 0.064 2006 0.104 2007 0.117 2008 -0.033 2009 0.107 2010 0.137 2011 0.070 2012 0.068 2013 -0.006 2014 0.052 2015 -0.018 2016 0.052 2017 0.095 2018 -0.012 2019 0.145 2020 0.134 2021 0.057 2022 -0.082 2023 0.109 Since the strategy, summary statistics, and annual returns are all exported as excel files, they can be found at the following locations:\nStocks_Bonds_Gold_Cash_Strategy.xlsx Stocks_Bonds_Gold_Cash_Summary_Stats.xlsx Stocks_Bonds_Gold_Cash_Annual_Returns.xlsx Next we will look at some plots to help visualize the data.\nGenerate Plots Here are the various functions needed for the plots:\nPlot Cumulative Return Plot cumulative return:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 def plot_cumulative_return(strat_df): # Generate plot plt.figure(figsize=(10, 5), facecolor 
= \u0026#39;#F5F5F5\u0026#39;) # Plotting data plt.plot(strat_df.index, strat_df[\u0026#39;Cumulative_Return\u0026#39;], label = \u0026#39;Strategy Cumulative Return\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;green\u0026#39;, linewidth=1) # Set X axis # x_tick_spacing = 5 # Specify the interval for x-axis ticks # plt.gca().xaxis.set_major_locator(MultipleLocator(x_tick_spacing)) plt.gca().xaxis.set_major_locator(mdates.YearLocator()) plt.gca().xaxis.set_major_formatter(mdates.DateFormatter(\u0026#39;%Y\u0026#39;)) plt.xlabel(\u0026#39;Year\u0026#39;, fontsize = 9) plt.xticks(rotation = 45, fontsize = 7) # plt.xlim(, ) # Set Y axis y_tick_spacing = 0.5 # Specify the interval for y-axis ticks plt.gca().yaxis.set_major_locator(MultipleLocator(y_tick_spacing)) plt.ylabel(\u0026#39;Cumulative Return\u0026#39;, fontsize = 9) plt.yticks(fontsize = 7) plt.ylim(0, 7.5) # Set title, etc. plt.title(\u0026#39;Cumulative Return\u0026#39;, fontsize = 12) # Set the grid \u0026amp; legend plt.tight_layout() plt.grid(True) plt.legend(fontsize=8) # Save the figure plt.savefig(\u0026#39;03_Cumulative_Return.png\u0026#39;, dpi=300, bbox_inches=\u0026#39;tight\u0026#39;) # Display the plot return plt.show() Plot Portfolio Values Plot portfolio values:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 def plot_values(strat_df): # Generate plot plt.figure(figsize=(10, 5), facecolor = \u0026#39;#F5F5F5\u0026#39;) # Plotting data plt.plot(strat_df.index, strat_df[\u0026#39;Total_AA_$_Invested\u0026#39;], label=\u0026#39;Total Portfolio Value\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;black\u0026#39;, linewidth=1) plt.plot(strat_df.index, strat_df[\u0026#39;Stocks_AA_$_Invested\u0026#39;], label=\u0026#39;Stocks Position Value\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;orange\u0026#39;, linewidth=1) plt.plot(strat_df.index, strat_df[\u0026#39;Bonds_AA_$_Invested\u0026#39;], 
label=\u0026#39;Bond Position Value\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;yellow\u0026#39;, linewidth=1) plt.plot(strat_df.index, strat_df[\u0026#39;Gold_AA_$_Invested\u0026#39;], label=\u0026#39;Gold Position Value\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;blue\u0026#39;, linewidth=1) plt.plot(strat_df.index, strat_df[\u0026#39;Cash_AA_$_Invested\u0026#39;], label=\u0026#39;Cash Position Value\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;brown\u0026#39;, linewidth=1) # Set X axis # x_tick_spacing = 5 # Specify the interval for x-axis ticks # plt.gca().xaxis.set_major_locator(MultipleLocator(x_tick_spacing)) plt.gca().xaxis.set_major_locator(mdates.YearLocator()) plt.gca().xaxis.set_major_formatter(mdates.DateFormatter(\u0026#39;%Y\u0026#39;)) plt.xlabel(\u0026#39;Year\u0026#39;, fontsize = 9) plt.xticks(rotation = 45, fontsize = 7) # plt.xlim(, ) # Set Y axis y_tick_spacing = 5000 # Specify the interval for y-axis ticks plt.gca().yaxis.set_major_locator(MultipleLocator(y_tick_spacing)) plt.gca().yaxis.set_major_formatter(mtick.FuncFormatter(lambda x, pos: \u0026#39;{:,.0f}\u0026#39;.format(x))) # Adding commas to y-axis labels plt.ylabel(\u0026#39;Total Value ($)\u0026#39;, fontsize = 9) plt.yticks(fontsize = 7) plt.ylim(0, 75000) # Set title, etc. 
plt.title(\u0026#39;Total Values For Stocks, Bonds, Gold, and Cash Positions and Portfolio\u0026#39;, fontsize = 12) # Set the grid \u0026amp; legend plt.tight_layout() plt.grid(True) plt.legend(fontsize=8) # Save the figure plt.savefig(\u0026#39;04_Portfolio_Values.png\u0026#39;, dpi=300, bbox_inches=\u0026#39;tight\u0026#39;) # Display the plot return plt.show() Plot Portfolio Drawdown Plot portfolio drawdown:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 def plot_drawdown(strat_df): rolling_max = strat_df[\u0026#39;Total_AA_$_Invested\u0026#39;].cummax() drawdown = (strat_df[\u0026#39;Total_AA_$_Invested\u0026#39;] - rolling_max) / rolling_max * 100 # Generate plot plt.figure(figsize=(10, 5), facecolor = \u0026#39;#F5F5F5\u0026#39;) # Plotting data plt.plot(strat_df.index, drawdown, label=\u0026#39;Drawdown\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;red\u0026#39;, linewidth=1) # Set X axis # x_tick_spacing = 5 # Specify the interval for x-axis ticks # plt.gca().xaxis.set_major_locator(MultipleLocator(x_tick_spacing)) plt.gca().xaxis.set_major_locator(mdates.YearLocator()) plt.gca().xaxis.set_major_formatter(mdates.DateFormatter(\u0026#39;%Y\u0026#39;)) plt.xlabel(\u0026#39;Year\u0026#39;, fontsize = 9) plt.xticks(rotation = 45, fontsize = 7) # plt.xlim(, ) # Set Y axis y_tick_spacing = 1 # Specify the interval for y-axis ticks plt.gca().yaxis.set_major_locator(MultipleLocator(y_tick_spacing)) # plt.gca().yaxis.set_major_formatter(mtick.FuncFormatter(lambda x, pos: \u0026#39;{:,.0f}\u0026#39;.format(x))) # Adding commas to y-axis labels plt.gca().yaxis.set_major_formatter(mtick.FuncFormatter(lambda x, pos: \u0026#39;{:.0f}\u0026#39;.format(x))) # Adding 0 decimal places to y-axis labels plt.ylabel(\u0026#39;Drawdown (%)\u0026#39;, fontsize = 9) plt.yticks(fontsize = 7) plt.ylim(-20, 0) # Set title, etc. 
plt.title(\u0026#39;Portfolio Drawdown\u0026#39;, fontsize = 12) # Set the grid \u0026amp; legend plt.tight_layout() plt.grid(True) plt.legend(fontsize=8) # Save the figure plt.savefig(\u0026#39;05_Portfolio_Drawdown.png\u0026#39;, dpi=300, bbox_inches=\u0026#39;tight\u0026#39;) # Display the plot return plt.show() Plot Portfolio Asset Weights Plot portfolio asset weights:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 def plot_asset_weights(strat_df): # Generate plot plt.figure(figsize=(10, 5), facecolor = \u0026#39;#F5F5F5\u0026#39;) # Plotting data plt.plot(strat_df.index, strat_df[\u0026#39;Stocks_AA_Port_%\u0026#39;] * 100, label=\u0026#39;Stocks Portfolio Weight\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;orange\u0026#39;, linewidth=1) plt.plot(strat_df.index, strat_df[\u0026#39;Bonds_AA_Port_%\u0026#39;] * 100, label=\u0026#39;Bonds Portfolio Weight\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;yellow\u0026#39;, linewidth=1) plt.plot(strat_df.index, strat_df[\u0026#39;Gold_AA_Port_%\u0026#39;] * 100, label=\u0026#39;Gold Portfolio Weight\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;blue\u0026#39;, linewidth=1) plt.plot(strat_df.index, strat_df[\u0026#39;Cash_AA_Port_%\u0026#39;] * 100, label=\u0026#39;Cash Portfolio Weight\u0026#39;, linestyle=\u0026#39;-\u0026#39;, color=\u0026#39;brown\u0026#39;, linewidth=1) # Set X axis # x_tick_spacing = 5 # Specify the interval for x-axis ticks # plt.gca().xaxis.set_major_locator(MultipleLocator(x_tick_spacing)) plt.gca().xaxis.set_major_locator(mdates.YearLocator()) plt.gca().xaxis.set_major_formatter(mdates.DateFormatter(\u0026#39;%Y\u0026#39;)) plt.xlabel(\u0026#39;Year\u0026#39;, fontsize = 9) plt.xticks(rotation = 45, fontsize = 7) # plt.xlim(, ) # Set Y axis y_tick_spacing = 2 # Specify the interval for y-axis ticks plt.gca().yaxis.set_major_locator(MultipleLocator(y_tick_spacing)) # 
plt.gca().yaxis.set_major_formatter(mtick.FuncFormatter(lambda x, pos: \u0026#39;{:,.0f}\u0026#39;.format(x))) # Adding commas to y-axis labels plt.ylabel(\u0026#39;Asset Weight (%)\u0026#39;, fontsize = 9) plt.yticks(fontsize = 7) plt.ylim(14, 36) # Set title, etc. plt.title(\u0026#39;Portfolio Asset Weights For Stocks, Bonds, Gold, and Cash Positions\u0026#39;, fontsize = 12) # Set the grid \u0026amp; legend plt.tight_layout() plt.grid(True) plt.legend(fontsize=8) # Save the figure plt.savefig(\u0026#39;07_Portfolio_Weights.png\u0026#39;, dpi=300, bbox_inches=\u0026#39;tight\u0026#39;) # Display the plot return plt.show() Execute plots:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 plot_cumulative_return(strat) plot_values(strat) plot_drawdown(strat) plot_asset_weights(strat) # Create dataframe for the annual returns strat_annual_returns = strat[\u0026#39;Cumulative_Return\u0026#39;].resample(\u0026#39;Y\u0026#39;).last().pct_change().dropna() strat_annual_returns_df = strat_annual_returns.to_frame() strat_annual_returns_df[\u0026#39;Year\u0026#39;] = strat_annual_returns_df.index.year # Add a \u0026#39;Year\u0026#39; column with just the year strat_annual_returns_df.reset_index(drop=True, inplace=True) # Reset the index to remove the datetime index # Now the DataFrame will have \u0026#39;Year\u0026#39; and \u0026#39;Cumulative_Return\u0026#39; columns strat_annual_returns_df = strat_annual_returns_df[[\u0026#39;Year\u0026#39;, \u0026#39;Cumulative_Return\u0026#39;]] # Keep only \u0026#39;Year\u0026#39; and \u0026#39;Cumulative_Return\u0026#39; columns strat_annual_returns_df.rename(columns = {\u0026#39;Cumulative_Return\u0026#39;:\u0026#39;Return\u0026#39;}, inplace=True) strat_annual_returns_df.set_index(\u0026#39;Year\u0026#39;, inplace=True) display(strat_annual_returns_df) plan_name = \u0026#39;_\u0026#39;.join(fund_list) file = plan_name + \u0026#34;_Annual_Returns.xlsx\u0026#34; location = file strat_annual_returns_df.to_excel(location, 
sheet_name=\u0026#39;data\u0026#39;) plot_annual_returns(strat_annual_returns_df) Here are several relevant plots:\nCumulative Return Portfolio Values (Total, Stocks, Bonds, Gold, and Cash) Here we can see the annual rebalancing taking effect with the values of the different asset classes. This can also be seen more clearly below.\nPortfolio Drawdown From this plot, we can see that the maximum drawdown came during the GFC; the drawdown during COVID was (interestingly) less than 10%.\nPortfolio Asset Weights The annual rebalancing appears to work effectively by selling assets that have increased in value and buying assets that have decreased in value over the previous year. Also note that there is only one instance when the weight of an asset fell to 15%. This occurred for stocks during the GFC.\nPortfolio Annual Returns It\u0026rsquo;s interesting to see that there really aren\u0026rsquo;t any significant up or down years. Instead, it\u0026rsquo;s a steady climb without much volatility.\nSummary Overall, this is an interesting case study and Browne\u0026rsquo;s idea behind the Permanent Portfolio is certainly compelling. There might be more investigation to be done with respect to the following:\nInvestigate the extent to which the rebalancing date affects the portfolio performance Vary the weights of the asset classes to see if there is a meaningful change in the results Experiment with leverage (i.e., simulating 1.2x leverage with a portfolio with weights of 30, 30, 30, 10 for stocks, bonds, gold, and cash, respectively) Use ETFs instead of Bloomberg index data, and verify the results are similar. ETF data is much more readily available than the Bloomberg index data. References Fail-Safe Investing: Lifelong Financial Security in 30 Minutes, by Harry Browne Code The jupyter notebook with the functions and all other code is available here. The html export of the jupyter notebook is available here.
The pdf export of the jupyter notebook is available here.\n","date":"2024-11-04T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2024/11/04/harry-browne-permanent-portfolio/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2024/11/04/harry-browne-permanent-portfolio/","title":"Does Harry Browne's permanent portfolio withstand the test of time?"},{"content":"Post Updates Update 4/12/2025: Revised script to accommodate a list of excluded directories.\nIntroduction While there are numerous backup solutions available for Linux, many require extensive configuration and maintenance, and restoring from the backup is not always simple. Incremental backups are ideal because they maintain snapshots of the files and allow for access to previous versions of files.\nLinux Journal recently published an article on various backup solutions, and I thought I\u0026rsquo;d provide my incremental backup script that uses rsync and cp.\nIncremental backup script This script provides an incremental backup solution and only requires rsync and cp to be installed on the system.\n#!/bin/bash # Define the directories to backup and their destination directories source_dir1=\u0026#34;/source1\u0026#34; backup_dir1=\u0026#34;/backup1/\u0026#34; source_dir2=\u0026#34;/source2\u0026#34; backup_dir2=\u0026#34;/backup2/\u0026#34; # Define excluded directories excluded_dir1=\u0026#34;leave/out/\u0026#34; excluded_dir2=\u0026#34;dont/want/\u0026#34; excluded_dir3=\u0026#34;exclude/this/\u0026#34; # Function to run a backup run_backup() { source_dir=$1 backup_dir=$2 # Check if the source directory exists if [ !
-d \u0026#34;$source_dir\u0026#34; ]; then echo \u0026#34;Error: Source directory not found\u0026#34; exit 1 fi # Input year and date echo \u0026#34;What is today\u0026#39;s year:\u0026#34; read backup_year echo \u0026#34;What is today\u0026#39;s date:\u0026#34; read backup_date # Check if the backup directory exists and run backup if [ -d \u0026#34;$backup_dir\u0026#34; ]; then echo \u0026#34;Backup directory found, backing up $source_dir\u0026#34; rsync -av --delete --exclude \u0026#34;$excluded_dir1\u0026#34; --exclude \u0026#34;$excluded_dir2\u0026#34; --exclude \u0026#34;$excluded_dir3\u0026#34; $source_dir $backup_dir/Monthly/ cp -al $backup_dir/Monthly/ $backup_dir/$backup_year/$backup_date/ else echo \u0026#34;Error: Backup directory not found\u0026#34; exit 1 fi } # Run backups run_backup $source_dir1 $backup_dir1 run_backup $source_dir2 $backup_dir2 # Output confirmation echo \u0026#34;Backup complete\u0026#34; Updated Incremental Backup Script Here\u0026rsquo;s the updated script, which now accommodates a list of excluded directories, along with a few other checks for the year and backup date.\n#!/bin/bash # Define the directories to backup and their destination directories source_dir1=\u0026#34;/home/\u0026#34; backup_dir1=\u0026#34;/run/media/jared/Elements1/Backup\u0026#34; source_dir2=\u0026#34;/home/\u0026#34; backup_dir2=\u0026#34;/run/media/jared/Elements2/Backup\u0026#34; # Define excluded directories excluded_dirs=( \u0026#34;jared/Cloud_Storage/timeshift/\u0026#34; \u0026#34;jared/.cache/\u0026#34; \u0026#34;jared/.nv/\u0026#34; \u0026#34;jared/Cloud_Storage/Dropbox/.dropbox.cache/\u0026#34; ) # Function to run a backup run_backup() { local source_dir=\u0026#34;$1\u0026#34; local backup_dir=\u0026#34;$2\u0026#34; # Check if the source directory
exists if [ ! -d \u0026#34;$source_dir\u0026#34; ]; then echo \u0026#34;Error: Source directory \u0026#39;$source_dir\u0026#39; not found.\u0026#34; exit 2 fi # Input year and date echo \u0026#34;What is today\u0026#39;s year (YYYY):\u0026#34; read -r backup_year if [[ ! \u0026#34;$backup_year\u0026#34; =~ ^[0-9]{4}$ ]]; then echo \u0026#34;Error: Invalid year entered.\u0026#34; exit 3 fi echo \u0026#34;What is today\u0026#39;s date (YYYY-MM-DD):\u0026#34; read -r backup_date if [[ ! \u0026#34;$backup_date\u0026#34; =~ ^[0-9]{4}-[0-9]{2}-[0-9]{2}$ ]]; then echo \u0026#34;Error: Invalid date format. Use YYYY-MM-DD.\u0026#34; exit 4 fi # Check if the backup directory exists and run backup if [ -d \u0026#34;$backup_dir\u0026#34; ]; then echo \u0026#34;Backup directory \u0026#39;$backup_dir\u0026#39; found, backing up \u0026#39;$source_dir\u0026#39;...\u0026#34; # Build rsync exclude arguments exclude_args=() for dir in \u0026#34;${excluded_dirs[@]}\u0026#34;; do exclude_args+=(--exclude \u0026#34;$dir\u0026#34;) done # Perform the backup rsync -av --delete \u0026#34;${exclude_args[@]}\u0026#34; \u0026#34;$source_dir\u0026#34; \u0026#34;$backup_dir/Monthly/home/\u0026#34; mkdir -p \u0026#34;$backup_dir/$backup_year/$backup_date/\u0026#34; cp -al \u0026#34;$backup_dir/Monthly/\u0026#34; \u0026#34;$backup_dir/$backup_year/$backup_date/\u0026#34; echo \u0026#34;Backup for \u0026#39;$source_dir\u0026#39; completed successfully.\u0026#34; else echo \u0026#34;Error: Backup directory \u0026#39;$backup_dir\u0026#39; not found.\u0026#34; exit 5 fi } # Run backups run_backup \u0026#34;$source_dir1\u0026#34; \u0026#34;$backup_dir1\u0026#34; run_backup \u0026#34;$source_dir2\u0026#34; \u0026#34;$backup_dir2\u0026#34; # Output confirmation echo \u0026#34;All backups completed successfully.\u0026#34; Let\u0026rsquo;s break this down line by line.\nSource and backup directories First, we need to define the source and backup directories, and any directories from the source that are to 
be excluded from the backup:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 # Define the directories to backup and their destination directories source_dir1=\u0026#34;/home/\u0026#34; backup_dir1=\u0026#34;/run/media/jared/Elements1/Backup\u0026#34; source_dir2=\u0026#34;/home/\u0026#34; backup_dir2=\u0026#34;/run/media/jared/Elements2/Backup\u0026#34; # Define excluded directories excluded_dirs=( \u0026#34;jared/Cloud_Storage/timeshift/\u0026#34; \u0026#34;jared/.cache/\u0026#34; \u0026#34;jared/.nv/\u0026#34; \u0026#34;jared/Cloud_Storage/Dropbox/.dropbox.cache/\u0026#34; ) You can add as many directories as you want here. The script compiles them before executing the rsync command.\nBackup function Then we have the backup function. This performs the following:\nTakes an input of the source and backup directories (defined above) Checks to see if the source directory exists Prompts for a year Prompts for a date Checks to make sure the backup destination directory exists Executes the backup 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 # Function to run a backup run_backup() { local source_dir=\u0026#34;$1\u0026#34; local backup_dir=\u0026#34;$2\u0026#34; # Check if the source directory exists if [ ! -d \u0026#34;$source_dir\u0026#34; ]; then echo \u0026#34;Error: Source directory \u0026#39;$source_dir\u0026#39; not found.\u0026#34; exit 2 fi # Input year and date echo \u0026#34;What is today\u0026#39;s year (YYYY):\u0026#34; read -r backup_year if [[ ! \u0026#34;$backup_year\u0026#34; =~ ^[0-9]{4}$ ]]; then echo \u0026#34;Error: Invalid year entered.\u0026#34; exit 3 fi echo \u0026#34;What is today\u0026#39;s date (YYYY-MM-DD):\u0026#34; read -r backup_date if [[ ! \u0026#34;$backup_date\u0026#34; =~ ^[0-9]{4}-[0-9]{2}-[0-9]{2}$ ]]; then echo \u0026#34;Error: Invalid date format. 
Use YYYY-MM-DD.\u0026#34; exit 4 fi # Check if the backup directory exists and run backup if [ -d \u0026#34;$backup_dir\u0026#34; ]; then echo \u0026#34;Backup directory \u0026#39;$backup_dir\u0026#39; found, backing up \u0026#39;$source_dir\u0026#39;...\u0026#34; # Build rsync exclude arguments exclude_args=() for dir in \u0026#34;${excluded_dirs[@]}\u0026#34;; do exclude_args+=(--exclude \u0026#34;$dir\u0026#34;) done # Perform the backup rsync -av --delete \u0026#34;${exclude_args[@]}\u0026#34; \u0026#34;$source_dir\u0026#34; \u0026#34;$backup_dir/Monthly/home/\u0026#34; mkdir -p \u0026#34;$backup_dir/$backup_year/$backup_date/\u0026#34; cp -al \u0026#34;$backup_dir/Monthly/\u0026#34; \u0026#34;$backup_dir/$backup_year/$backup_date/\u0026#34; echo \u0026#34;Backup for \u0026#39;$source_dir\u0026#39; completed successfully.\u0026#34; else echo \u0026#34;Error: Backup directory \u0026#39;$backup_dir\u0026#39; not found.\u0026#34; exit 5 fi } rsync is used to compare the files in the source to the Monthly backup directory and then update or delete files accordingly.\nOnce the files are copied over via rsync, the cp command is used to link the files in the Monthly directory to the year/date/ directory. As files change in the Monthly directory, the links change as well. This method saves disk space because files are not copied over and over again. Any files that do not change are simply linked within the filesystem. The links take up a trivial amount of disk space, and the filesystem handles all of the heavy lifting associated with tracking which files are linked and where on the filesystem. There is no database, log, etc.
required to track the individual files and/or their versions.\nRunning backups Finally, run the backups and confirm complete:\n1 2 3 4 5 6 # Run backups run_backup \u0026#34;$source_dir1\u0026#34; \u0026#34;$backup_dir1\u0026#34; run_backup \u0026#34;$source_dir2\u0026#34; \u0026#34;$backup_dir2\u0026#34; # Output confirmation echo \u0026#34;All backups completed successfully.\u0026#34; Results This script provides an incremental backup record organized by year and date:\nAccessing older backups is straightforward - simply navigate to the desired directory within the filesystem.\nDeleting old backups Deleting or removing old and out-of-date backups is as simple as deleting the directories. The filesystem links and files that are not linked elsewhere are removed from the filesystem, freeing up the disk space.\nReferences https://rsync.samba.org/ https://github.com/WayneD/rsync https://www.gnu.org/software/coreutils/ https://www.man7.org/linux/man-pages/man1/cp.1.html ","date":"2024-01-12T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2024/01/12/simple-incremental-bash-backup-script/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2024/01/12/simple-incremental-bash-backup-script/","title":"Simple Incremental Bash Backup Script"},{"content":"Introduction In this tutorial, we will write a python function that pulls data from Nasdaq Data Link through the tables API, adds relevant columns that are not present in the raw data, updates columns to allow for ease of use, and leaves the data in a format where it can then be used in time series analysis.\nNasdaq Data Link is a provider of numerous different types of financial data from many different asset classes. It provides API\u0026rsquo;s that allow access from Python, R, Excel, and other methods. It is available to institutional investors as well as individual retail investors.\nNasdaq Data Link Initial Data Retrieval We will use the data for AAPL for this example. 
This will give us a data set that requires some thought as to how the splits need to be handled as well as the dividends.\nWe\u0026rsquo;ll start by pulling the initial data set, with the first 10 rows shown as follows from the pandas dataframe:\nAnd the last 10 rows:\nFrom left to right, we have the following columns:\nRow number: 0-indexed, gives us the total number of rows/dates of data Ticker: The ticker symbol for our data Date: In the format YYYY-MM-DD Open: Daily open High: Daily high Low: Daily low Close: Daily close Volume: Volume of shares traded Dividend: Dividend paid on that date Split: Split executed on that date Adjusted Open: Daily open price adjusted for all splits and dividends Adjusted High: Daily high price adjusted for all splits and dividends Adjusted Low: Daily low price adjusted for all splits and dividends Adjusted Close: Daily close price adjusted for all splits and dividends Adjusted Volume: Daily volume adjusted for all splits Data questions The above information is a good starting point, but what if we are looking for the following answers?\nThe data shows a split value for every day, but we know the stock didn\u0026rsquo;t split every day. What does this represent? What is the total cumulative split ratio? What is the split ratio at different points in time? What is the adjusted share price without including the dividends? This would be needed for any time series analysis. What is the dividend dollar value based on an adjusted share price? What would the share price be if the stock hadn\u0026rsquo;t split?
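As a quick preview of how the cumulative split questions can be answered, here is a minimal sketch using pandas `cumprod` on a hypothetical split history (the dates and ratios below are illustrative only, not the full AAPL record):

```python
import pandas as pd

# Hypothetical split events (illustrative only, not a full split history)
splits = pd.Series(
    [2.0, 7.0, 4.0],
    index=pd.to_datetime(["2005-02-28", "2014-06-09", "2020-08-31"]),
    name="split",
)

# Running product of the ratios gives the split ratio at each point in time
cum_split = splits.cumprod()

# The last value is the total cumulative split ratio over the whole history
print(cum_split.iloc[-1])  # 56.0 shares today for each original share
```

The same running-product idea, merged back into the daily price data, is what the function below uses to answer the remaining questions.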
We\u0026rsquo;ll add columns and modify as necessary to answer the above questions and more.\nAssumptions The remainder of this tutorial assumes the following:\nYou have the Nasdaq Data Link library installed You have the pandas library installed You have the OpenPyXL library installed Python function to modify the data The following function will perform the desired modifications:\n# This function pulls the data for the specific fund from Nasdaq Data # Link and adds many missing columns # Imports import nasdaqdatalink import pandas as pd import numpy as np # Add API key for reference to allow access to unrestricted data nasdaqdatalink.ApiConfig.api_key = \u0026#39;your_key\u0026#39; # Function definition def ndl_data_updater(fund): # Command to pull data # If start date and end date are not specified the entire data set is included df = nasdaqdatalink.get_table(\u0026#39;QUOTEMEDIA/PRICES\u0026#39;, ticker = fund, paginate=True) # Sort columns by date ascending df.sort_values(\u0026#39;date\u0026#39;, ascending = True, inplace = True) # Rename date column df.rename(columns = {\u0026#39;date\u0026#39;:\u0026#39;Date\u0026#39;}, inplace = True) # Set index to date column df.set_index(\u0026#39;Date\u0026#39;, inplace = True) # Replace all split values of 1.0 with NaN df[\u0026#39;split\u0026#39;] = df[\u0026#39;split\u0026#39;].replace(1.0, np.nan) # Create a new data frame with split values only df_splits = df.drop(columns = {\u0026#39;ticker\u0026#39;, \u0026#39;open\u0026#39;, \u0026#39;high\u0026#39;, \u0026#39;low\u0026#39;, \u0026#39;close\u0026#39;, \u0026#39;volume\u0026#39;, \u0026#39;dividend\u0026#39;, \u0026#39;adj_open\u0026#39;, \u0026#39;adj_high\u0026#39;, \u0026#39;adj_low\u0026#39;, \u0026#39;adj_close\u0026#39;,
\u0026#39;adj_volume\u0026#39;}).dropna() # Create a new column for cumulative split df_splits[\u0026#39;Cum_Split\u0026#39;] = df_splits[\u0026#39;split\u0026#39;].cumprod() # Drop original split column before combining dataframes df_splits.drop(columns = {\u0026#39;split\u0026#39;}, inplace = True) # Merge df and df_split dataframes df_comp = pd.merge(df, df_splits, on=\u0026#39;Date\u0026#39;, how=\u0026#39;outer\u0026#39;) # Forward fill for all cumulative split values df_comp[\u0026#39;Cum_Split\u0026#39;].fillna(method = \u0026#39;ffill\u0026#39;, inplace = True) # Replace all split and cumulative split values of NaN with 1.0 to have complete split values df_comp[\u0026#39;split\u0026#39;] = df_comp[\u0026#39;split\u0026#39;].replace(np.nan, 1.0) df_comp[\u0026#39;Cum_Split\u0026#39;] = df_comp[\u0026#39;Cum_Split\u0026#39;].replace(np.nan, 1.0) # Calculate the non adjusted prices based on the splits only df_comp[\u0026#39;non_adj_open_split_only\u0026#39;] = df_comp[\u0026#39;open\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] df_comp[\u0026#39;non_adj_high_split_only\u0026#39;] = df_comp[\u0026#39;high\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] df_comp[\u0026#39;non_adj_low_split_only\u0026#39;] = df_comp[\u0026#39;low\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] df_comp[\u0026#39;non_adj_close_split_only\u0026#39;] = df_comp[\u0026#39;close\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] df_comp[\u0026#39;non_adj_dividend_split_only\u0026#39;] = df_comp[\u0026#39;dividend\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] # Calculate the adjusted prices based on the splits df_comp[\u0026#39;Open\u0026#39;] = df_comp[\u0026#39;non_adj_open_split_only\u0026#39;] / df_comp[\u0026#39;Cum_Split\u0026#39;][-1] df_comp[\u0026#39;High\u0026#39;] = df_comp[\u0026#39;non_adj_high_split_only\u0026#39;] / df_comp[\u0026#39;Cum_Split\u0026#39;][-1] df_comp[\u0026#39;Low\u0026#39;] = df_comp[\u0026#39;non_adj_low_split_only\u0026#39;] / 
df_comp[\u0026#39;Cum_Split\u0026#39;][-1] df_comp[\u0026#39;Close\u0026#39;] = df_comp[\u0026#39;non_adj_close_split_only\u0026#39;] / df_comp[\u0026#39;Cum_Split\u0026#39;][-1] df_comp[\u0026#39;Dividend\u0026#39;] = df_comp[\u0026#39;non_adj_dividend_split_only\u0026#39;] / df_comp[\u0026#39;Cum_Split\u0026#39;][-1] df_comp[\u0026#39;Dividend_Pct_Orig\u0026#39;] = df_comp[\u0026#39;dividend\u0026#39;] / df_comp[\u0026#39;close\u0026#39;] df_comp[\u0026#39;Dividend_Pct_Adj\u0026#39;] = df_comp[\u0026#39;Dividend\u0026#39;] / df_comp[\u0026#39;Close\u0026#39;] # Export data to excel file = fund + \u0026#34;_NDL.xlsx\u0026#34; df_comp.to_excel(file, sheet_name=\u0026#39;data\u0026#39;) # Output confirmation print(f\u0026#34;The last date of data for {fund} is: \u0026#34;) print(df_comp[-1:]) print(f\u0026#34;NDL data updater complete for {fund} data\u0026#34;) return print(f\u0026#34;--------------------\u0026#34;) Let\u0026rsquo;s break this down line by line.\nImports First, we need to import the required libraries:\n1 2 3 4 # Imports import nasdaqdatalink import pandas as pd import numpy as np NDL API Key To gain access to anything beyond the free tier, you will need to provide your access key:\n1 2 # Add API key for reference to allow access to unrestricted data nasdaqdatalink.ApiConfig.api_key = \u0026#39;your_key\u0026#39; Download data as a dataframe Moving on to the function definition, we have the command to pull data from NDL. There are two separate APIs - the time series and the tables. The syntax is different, and some data sets are only available as one or the other. 
We will use the tables API for this tutorial.\n# Command to pull data # If start date and end date are not specified the entire data set is included df = nasdaqdatalink.get_table(\u0026#39;QUOTEMEDIA/PRICES\u0026#39;, ticker = fund, paginate=True) In the example above, the fund is an input parameter to the function.\nThe \u0026#39;QUOTEMEDIA/PRICES\u0026#39; is the data source that we are accessing.\nThere are many other arguments that we could pass in the call above, including specifying columns, period start date, period end date, and others. Nasdaq has a few examples to get you started:\nhttps://docs.data.nasdaq.com/docs/python-tables\nRunning:\ndf.head(10) Gives us:\nSort columns by date Next, we will sort the columns by date ascending. By default, the dataframe is created with the data sorted by descending date, and we want to change that:\n# Sort columns by date ascending df.sort_values(\u0026#39;date\u0026#39;, ascending = True, inplace = True) The inplace = True argument specifies that the sort function should take effect on the existing dataframe.\nNow, running:\ndf.head(10) Gives us:\nSetting the date as the index Next, we will rename the date column from \u0026lsquo;date\u0026rsquo; to \u0026lsquo;Date\u0026rsquo;, and set the index to be the Date column:\n# Rename date column df.rename(columns = {\u0026#39;date\u0026#39;:\u0026#39;Date\u0026#39;}, inplace = True) # Set index to date column df.set_index(\u0026#39;Date\u0026#39;, inplace = True) Now, running:\ndf.head(10) Gives us:\nCalculating splits The next sections deal with the split column. So far we have only seen a split value of 1.0 in the data, but we\u0026rsquo;ve only looked at the first 10 and last 10 rows. Are there any other values? Let\u0026rsquo;s check by running:\ndf_not_1_split = df[df[\u0026#39;split\u0026#39;] != 1.0] And checking the first 10 rows:\ndf_not_1_split.head(10) Gives us:\nSo we now know that the stock did in fact split several times.
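Before walking through the full dataset, the replace / cumprod / merge / forward-fill steps that follow can be sketched end to end on a small synthetic frame. The prices, dates, and single 2:1 split below are hypothetical, and `join` plus `ffill` is used as a compact stand-in for the `pd.merge` and `fillna` calls in the function:

```python
import numpy as np
import pandas as pd

# Synthetic daily data with a 2:1 split on the third day (hypothetical values)
df = pd.DataFrame(
    {"close": [10.0, 10.2, 5.1, 5.3], "split": [1.0, 1.0, 2.0, 1.0]},
    index=pd.to_datetime(["2020-01-02", "2020-01-03", "2020-01-06", "2020-01-07"]),
)

# Replace the meaningless 1.0 entries and keep only the real split days
df["split"] = df["split"].replace(1.0, np.nan)
df_splits = df[["split"]].dropna().copy()
df_splits["Cum_Split"] = df_splits["split"].cumprod()

# Merge the cumulative ratio back in and forward fill;
# rows before the first split default to 1.0
df = df.join(df_splits[["Cum_Split"]], how="left")
df["Cum_Split"] = df["Cum_Split"].ffill().fillna(1.0)

print(df["Cum_Split"].tolist())  # [1.0, 1.0, 2.0, 2.0]
```

With the toy frame in hand, the intermediate results at each step of the walkthrough below should be easy to recognize.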
Next, we will replace all of the 1.0 split values - because they are really meaningless - and then create a new dataframe to deal with the splits.\n1 2 # Replace all split values of 1.0 with NaN df[\u0026#39;split\u0026#39;] = df[\u0026#39;split\u0026#39;].replace(1.0, np.nan) This gives us:\nWe will now create a dataframe with only the split values:\n1 2 3 4 # Create a new data frame with split values only df_splits = df.drop(columns = {\u0026#39;ticker\u0026#39;, \u0026#39;open\u0026#39;, \u0026#39;high\u0026#39;, \u0026#39;low\u0026#39;, \u0026#39;close\u0026#39;, \u0026#39;volume\u0026#39;, \u0026#39;dividend\u0026#39;, \u0026#39;adj_open\u0026#39;, \u0026#39;adj_high\u0026#39;, \u0026#39;adj_low\u0026#39;, \u0026#39;adj_close\u0026#39;, \u0026#39;adj_volume\u0026#39;}).dropna() Which gives us:\nCreating a column for the cumulative split will provide an accurate perspective on the stock price. We can do that with the following:\n1 2 # Create a new column for cumulative split df_splits[\u0026#39;Cum_Split\u0026#39;] = df_splits[\u0026#39;split\u0026#39;].cumprod() Which gives us:\nWe will then drop the original split column before combining the split data frame with the original data frame, as follows:\n1 2 # Drop original split column before combining dataframes df_splits.drop(columns = {\u0026#39;split\u0026#39;}, inplace = True) Which gives us:\nCombining dataframes Now we will merge the df_split dataframe with the original df dataframe so that the cumulative split column is part of the original dataframe. We will call this data frame df_comp:\n1 2 # Merge df and df_split dataframes df_comp = pd.merge(df, df_splits, on=\u0026#39;Date\u0026#39;, how=\u0026#39;outer\u0026#39;) We are using the merge function of pandas, which includes arguments for the names of both dataframes to be merged, the column to match between the dataframes, and the parameter for the type of merge to be performed. 
The outer argument specifies that all rows from both dataframes will be included, and any missing values will be filled in with NaN if there is no matching data. This ensures that all data from both dataframes is retained.\nRunning:\ndf_comp.head(10) Gives us:\nForward filling cumulative split values From here, we want to fill in the rest of the split and Cum_Split values. This is done using the forward fill function, which for all cells that have a value of NaN will fill in the previous valid value until another value is encountered. Here\u0026rsquo;s the code:\n1 2 # Forward fill for all cumulative split values df_comp[\u0026#39;Cum_Split\u0026#39;].fillna(method = \u0026#39;ffill\u0026#39;, inplace = True) Running:\ndf_comp.head(10) Gives us:\nAt first glance, it doesn\u0026rsquo;t look like anything changed. That\u0026rsquo;s because there wasn\u0026rsquo;t any ffill action taken on the initial values until pandas encountered a valid value to then forward fill. However, checking the last 10 rows:\ndf_comp.tail(10) Gives us:\nWhich is the result that we were expecting. But, what about the first rows from 12/12/1980 to 6/15/1987? 
We can fill those split and Cum_Split values with the following code:\n1 2 3 # Replace all split and cumulative split values of NaN with 1.0 to have complete split values df_comp[\u0026#39;split\u0026#39;] = df_comp[\u0026#39;split\u0026#39;].replace(np.nan, 1.0) df_comp[\u0026#39;Cum_Split\u0026#39;] = df_comp[\u0026#39;Cum_Split\u0026#39;].replace(np.nan, 1.0) Now, checking the first 10 rows:\ndf_comp.head(10) Gives us:\nWith this data, we now know for every day in the data set the following pieces of information:\nIf the stock split on that day What the total split ratio is up to and including that day Calculating adjusted and non-adjusted prices From here, we can complete our dataset by calculating the adjusted and non-adjusted prices using the cumulative split ratios from above:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 # Calculate the non adjusted prices based on the splits only df_comp[\u0026#39;non_adj_open_split_only\u0026#39;] = df_comp[\u0026#39;open\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] df_comp[\u0026#39;non_adj_high_split_only\u0026#39;] = df_comp[\u0026#39;high\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] df_comp[\u0026#39;non_adj_low_split_only\u0026#39;] = df_comp[\u0026#39;low\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] df_comp[\u0026#39;non_adj_close_split_only\u0026#39;] = df_comp[\u0026#39;close\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] df_comp[\u0026#39;non_adj_dividend_split_only\u0026#39;] = df_comp[\u0026#39;dividend\u0026#39;] * df_comp[\u0026#39;Cum_Split\u0026#39;] # Calculate the adjusted prices based on the splits df_comp[\u0026#39;Open\u0026#39;] = df_comp[\u0026#39;non_adj_open_split_only\u0026#39;] / df_comp[\u0026#39;Cum_Split\u0026#39;].iloc[-1] df_comp[\u0026#39;High\u0026#39;] = df_comp[\u0026#39;non_adj_high_split_only\u0026#39;] / df_comp[\u0026#39;Cum_Split\u0026#39;].iloc[-1] df_comp[\u0026#39;Low\u0026#39;] = df_comp[\u0026#39;non_adj_low_split_only\u0026#39;] / 
df_comp[\u0026#39;Cum_Split\u0026#39;].iloc[-1] df_comp[\u0026#39;Close\u0026#39;] = df_comp[\u0026#39;non_adj_close_split_only\u0026#39;] / df_comp[\u0026#39;Cum_Split\u0026#39;].iloc[-1] df_comp[\u0026#39;Dividend\u0026#39;] = df_comp[\u0026#39;non_adj_dividend_split_only\u0026#39;] / df_comp[\u0026#39;Cum_Split\u0026#39;].iloc[-1] df_comp[\u0026#39;Dividend_Pct_Orig\u0026#39;] = df_comp[\u0026#39;dividend\u0026#39;] / df_comp[\u0026#39;close\u0026#39;] df_comp[\u0026#39;Dividend_Pct_Adj\u0026#39;] = df_comp[\u0026#39;Dividend\u0026#39;] / df_comp[\u0026#39;Close\u0026#39;] Included above are the adjusted dividend values. For any time series analysis, not only are the adjusted prices needed, but so are the adjusted dividends. Remember, we already have the adjusted total return prices - those come directly from NDL.\nExport data Next, we want to export the data to an excel file, for easy viewing and reference later:\n1 2 3 # Export data to excel file = fund + \u0026#34;_NDL.xlsx\u0026#34; df_comp.to_excel(file, sheet_name=\u0026#39;data\u0026#39;) And verify the output is as expected:\nOutput confirmation Finally, we want to print a confirmation that the process succeeded along with the last date we have for data:\n1 2 3 4 5 # Output confirmation print(f\u0026#34;The last date of data for {fund} is: \u0026#34;) print(df_comp[-1:]) print(f\u0026#34;NDL data updater complete for {fund} data\u0026#34;) print(f\u0026#34;--------------------\u0026#34;) And confirming the output:\nReferences https://docs.data.nasdaq.com/docs https://docs.data.nasdaq.com/docs/tables-1 https://docs.data.nasdaq.com/docs/time-series https://docs.data.nasdaq.com/docs/python\n","date":"2023-12-24T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2023/12/24/nasdaq-data-link-tables-api-data-retrieval/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2023/12/24/nasdaq-data-link-tables-api-data-retrieval/","title":"Nasdaq Data Link Tables API Data Retrieval"},{"content":"Introduction In this 
tutorial, we will write a python function that imports an excel export from Bloomberg, removes ancillary rows and columns, and leaves the data in a format where it can then be used in time series analysis.\nExample of a Bloomberg excel export We will use the SPX index data in this example. Exporting the data from Bloomberg using the excel Bloomberg add-on yields data in the following format:\nData modifications The above format isn\u0026rsquo;t horrible, but we want to perform the following modifications:\nRemove the first six rows of the data Convert the 7th row to become column headings Rename column 2 to \u0026ldquo;Close\u0026rdquo; to represent the closing price Remove column 3, as we are not concerned about volume Export to excel and make the name of the excel worksheet \u0026ldquo;data\u0026rdquo; Assumptions The remainder of this tutorial assumes the following:\nYour excel file is named \u0026ldquo;SPX_Index.xlsx\u0026rdquo; The worksheet in the excel file is named \u0026ldquo;Worksheet\u0026rdquo; You have the pandas library installed You have the OpenPyXL library installed Python function to modify the data The following function will perform the modifications mentioned above:\n1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 # This function takes an excel export from Bloomberg and # removes all excess data leaving date and close columns # Imports import pandas as pd # Function definition def bb_data_updater(fund): # File name variable file = fund + \u0026#34;_Index.xlsx\u0026#34; # Import data from file as a pandas dataframe df = pd.read_excel(file, sheet_name = \u0026#39;Worksheet\u0026#39;, engine=\u0026#39;openpyxl\u0026#39;) # Set the column headings from row 5 (which is physically row 6) df.columns = df.iloc[5] # Set the column heading for the index to be \u0026#34;None\u0026#34; df.rename_axis(None, axis=1, inplace = True) # Drop the first 6 rows, 0 - 5 
df.drop(df.index[0:6], inplace=True) # Set the date column as the index df.set_index(\u0026#39;Date\u0026#39;, inplace = True) # Drop the volume column try: df.drop(columns = {\u0026#39;PX_VOLUME\u0026#39;}, inplace = True) except KeyError: pass # Rename column df.rename(columns = {\u0026#39;PX_LAST\u0026#39;:\u0026#39;Close\u0026#39;}, inplace = True) # Sort by date df.sort_values(by=[\u0026#39;Date\u0026#39;], inplace = True) # Export data to excel file = fund + \u0026#34;.xlsx\u0026#34; df.to_excel(file, sheet_name=\u0026#39;data\u0026#39;) # Output confirmation print(f\u0026#34;The last date of data for {fund} is: \u0026#34;) print(df[-1:]) print(f\u0026#34;Bloomberg data conversion complete for {fund} data\u0026#34;) return print(f\u0026#34;--------------------\u0026#34;) Let\u0026rsquo;s break this down line by line.\nImports First, we need to import pandas:\n1 import pandas as pd Import excel data file Then import the excel file as a pandas dataframe:\n1 2 3 4 5 # File name variable file = fund + \u0026#34;_Index.xlsx\u0026#34; # Import data from file as a pandas dataframe df = pd.read_excel(file, sheet_name = \u0026#39;Worksheet\u0026#39;, engine=\u0026#39;openpyxl\u0026#39;) Running:\ndf.head(10) Gives us:\nSet column headings Next, set the column heading:\n1 2 # Set the column headings from row 5 (which is physically row 6) df.columns = df.iloc[5] Now, running:\ndf.head(10) Gives us:\nRemove index heading Next, remove the column heading from the index column:\n1 2 # Set the column heading for the index to be \u0026#34;None\u0026#34; df.rename_axis(None, axis=1, inplace = True) Note: The axis=1 argument here specifies the column index.\nNow, running:\ndf.head(10) Gives us:\nDrop rows Next, we want to remove the first 6 rows that have unneeded data:\n1 2 # Drop the first 6 rows, 0 - 5 df.drop(df.index[0:6], inplace=True) Note: When dropping rows, the range to drop begins with row 0 and continues up to - but not including - row 6.\nNow, running:\ndf.head(10) 
Gives us:\nSet index Next, we want to set the date column as the index:\n1 2 # Set the date column as the index df.set_index(\u0026#39;Date\u0026#39;, inplace = True) Now, running:\ndf.head(10) Gives us:\nDrop the \u0026ldquo;PX_VOLUME\u0026rdquo; column Next, we want to drop the volume column:\n1 2 3 4 5 # Drop the volume column try: df.drop(columns = {\u0026#39;PX_VOLUME\u0026#39;}, inplace = True) except KeyError: pass For some data records, the volume column does not exist. Therefore, we try the drop, and if it fails with a KeyError, we assume the \u0026ldquo;PX_VOLUME\u0026rdquo; column does not exist and simply pass to move on.\nNow, running:\ndf.head(10) Gives us:\nRename the \u0026ldquo;PX_LAST\u0026rdquo; column Next, we want to rename the \u0026ldquo;PX_LAST\u0026rdquo; column as \u0026ldquo;Close\u0026rdquo;:\n1 2 # Rename column df.rename(columns = {\u0026#39;PX_LAST\u0026#39;:\u0026#39;Close\u0026#39;}, inplace = True) Now, running:\ndf.head(10) Gives us:\nSort data Next, we want to sort the data starting with the oldest date:\n1 2 # Sort by date df.sort_values(by=[\u0026#39;Date\u0026#39;], inplace = True) Now, running:\ndf.head(10) Gives us:\nExport data Next, we want to export the data to an excel file, for easy viewing and reference later:\n1 2 3 # Export data to excel file = fund + \u0026#34;.xlsx\u0026#34; df.to_excel(file, sheet_name=\u0026#39;data\u0026#39;) And verify the output is as expected:\nOutput confirmation Finally, we want to print a confirmation that the process succeeded along with the last date we have for data:\n1 2 3 4 5 # Output confirmation print(f\u0026#34;The last date of data for {fund} is: \u0026#34;) print(df[-1:]) print(f\u0026#34;Bloomberg data conversion complete for {fund} data\u0026#34;) print(f\u0026#34;--------------------\u0026#34;) And confirming the output:\nReferences 
https://www.bloomberg.com/professional/support/software-updates/\n","date":"2023-11-15T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2023/11/15/cleaning-bloomberg-excel-export/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2023/11/15/cleaning-bloomberg-excel-export/","title":"Cleaning A Bloomberg Data Excel Export"},{"content":"Introduction Here are my notes for some of the more commonly used git commands along with initial setup for git in Linux.\nInstallation To begin, install as follows for Arch Linux:\n$ sudo pacman -Sy git Or\n$ yay git Pacman will include all required dependencies.\nInitial configuration First, set your name and email address:\n$ git config --global user.name \u0026quot;Firstname Lastname\u0026quot; $ git config --global user.email \u0026quot;email@address.com\u0026quot; Then, set your preferred text editor (if you have one). I use nano:\n$ git config --global core.editor \u0026quot;nano\u0026quot; You can verify the updates with:\n$ git config --global core.editor Alternatively, you can edit the git configuration directly with:\n$ git config --global --edit Store credentials In 2021, GitHub disabled authentication via password and now requires authentication with a token. 
The following command sets up the credential helper, where it will store your token in ~/.git-credentials:\n$ git config --global credential.helper store After you log in during a git push with your username and token, the username or email address and token will be stored in the above location.\nNote: The token is stored in plain text, so use caution if that is a concern.\nCloning repositories Repositories can be cloned with the following:\n$ git clone https://github.com/\u0026lt;username\u0026gt;/\u0026lt;repository\u0026gt;.git Updating repositories The local record of a repository can be updated with the following command:\n$ cd \u0026lt;repository\u0026gt;/ $ git pull Adding, committing, and pushing Any files or directories that have been added, modified, or removed can be added to the list of changes to be pushed with the following command:\n$ git add . This command stages files that have been modified or deleted, but new files that you have not added are not affected:\n$ git commit -a This command commits any staged changes with the given message:\n$ git commit -m \u0026quot;message\u0026quot; These arguments can be stacked as follows:\n$ git commit -am \u0026quot;Add your commit message here\u0026quot; Note: Without add, commit will handle any changes to files that have been modified or deleted, but will not incorporate any files that have been created.\nThen finally pushed:\n$ git push If, for some reason, you would like to reset a commit:\n$ git reset These commands can be chained together with the AND operator:\n$ git add . 
\u0026amp;\u0026amp; git commit -am \u0026quot;Add your commit message here\u0026quot; \u0026amp;\u0026amp; git push Stashing changes If you forget to update a repository before making changes, you can \u0026ldquo;stash\u0026rdquo; those changes and then re-apply them after running git pull.\nFirst, stash the changes:\n$ git stash Then, update the local record of the repository:\n$ git pull Finally, re-apply the changes you previously made:\n$ git stash apply This has proven to be very useful for me when I forget to update a repository before making edits to the code.\nReferences https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens https://git-scm.com/book/en/v2/Getting-Started-First-Time-Git-Setup#_first_time https://git-scm.com/book/en/v2/Appendix-C%3A-Git-Commands-Setup-and-Config https://git-scm.com/book/en/v2/Git-Tools-Credential-Storage#_credential_caching https://git-scm.com/book/en/v2/Git-Tools-Stashing-and-Cleaning https://www.geeksforgeeks.org/difference-between-chaining-operators-in-linux/ ","date":"2023-10-16T00:00:00Z","image":"https://www.jaredszajkowski.com/stack/2023/10/16/git-quick-start-guide/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2023/10/16/git-quick-start-guide/","title":"Git Quick Start Guide"},{"content":"Introduction If anyone uses Zoom to record or Panopto to host recordings and later wants to access the recordings, here\u0026rsquo;s a simple Linux bash script to download the video file and accompanying subtitles. For a while I used zoomdl, but it is no longer under active development, and I began running into various issues about a year ago. 
I stumbled upon yt-dlp and found it under active development and quite extensive.\nThis tutorial requires you to have a \u0026ldquo;cookies\u0026rdquo; text file, which must contain your Zoom cookies, exported in the Netscape HTTP format after logging in.\nInstall cookie editor Install the cookie editor extension. I personally use it with Microsoft Edge, but there are similar extensions for Chrome, Firefox, etc.\nModify export format Change the preferred cookie export format to Netscape HTTP Cookie File in the extension options. It is necessary to export in this format, otherwise yt-dlp will not be able to read the cookies.txt file correctly.\nLog in to Zoom or Panopto Log in to Zoom or Panopto in your browser. Be sure to remain logged in while exporting the cookies.\nExport cookies The export button is at the top of the window. It copies the cookies to your clipboard, which then need to be pasted into a text file (I have mine saved as cookies.txt), which yt-dlp will then read when it executes.\nInstall yt-dlp In Arch Linux, yt-dlp can be found with:\n$ yay yt-dlp Or:\n$ sudo pacman -Sy yt-dlp Create bash script for Zoom Save the following code to a text file (my bash script file name is yt-dlp-zoom.sh):\n1 2 3 4 5 6 #!/bin/bash echo What is the link? read link yt-dlp --referer \u0026#34;https://zoom.us/\u0026#34; --cookies /path/to/cookies/file/cookies_zoom.txt -o \u0026#34;%(title)s-%(id)s.%(ext)s\u0026#34; --write-subs \u0026#34;$link\u0026#34; Create bash script for Panopto Save the following code to a text file (my bash script file name is yt-dlp-panopto.sh):\n1 2 3 4 5 6 #!/bin/bash echo What is the link? 
read link yt-dlp --cookies /path/to/cookies/file/cookies_panopto.txt -o \u0026#34;%(title)s-%(id)s.%(ext)s\u0026#34; --write-subs \u0026#34;$link\u0026#34; Change permissions Modify the permissions of the bash scripts to allow execution:\n$ chmod +x yt-dlp-zoom.sh $ chmod +x yt-dlp-panopto.sh Execute the scripts Execute the bash script with either ./yt-dlp-zoom.sh or ./yt-dlp-panopto.sh, copy and paste the link into the shell prompt for the video that you would like to save, and it should download the video and the subtitles.\nReferences https://ostechnix.com/yt-dlp-tutorial/ ","date":"2023-10-01T00:00:00Z","image":"https://www.jaredszajkowski.com/stack/2023/10/01/yt-dlp-with-zoom-and-panopto/banner_1.svg","permalink":"https://www.jaredszajkowski.com/stack/2023/10/01/yt-dlp-with-zoom-and-panopto/","title":"Using yt-dlp With Zoom And Panopto"},{"content":"Introduction This is the basic framework that I use to install Arch Linux, with a few changes catered to the Lenovo ThinkPad E15 Gen 2. I have found it to be a decent mid-range laptop with excellent Linux compatibility and a great keyboard, and overall it provides good value.\nGetting started This tutorial assumes the following:\nYou are booting from a USB drive with the Arch install ISO. Wireless or wired network is detected and drivers are configured automatically. You want drive encryption on your root partition, but not on your boot/efi/swap partitions. Verify UEFI boot mode The following command should list the directory contents without error:\n# ls /sys/firmware/efi/efivars Configure wireless network The following command will drop you into the iwd daemon:\n# iwctl From there:\n# device list # station *device* scan # station *device* get-networks # station *device* connect *SSID* Verify internet connectivity # ping archlinux.org Update system clock # timedatectl set-ntp true # timedatectl status Disks, partition table \u0026amp; partitions The following assumes that your NVME drive is found as /dev/nvme0n1. 
Partitions will then be /dev/nvme0n1p1 and so on.\nWipe disk List disks:\n# fdisk -l Wipe all file system records:\n# wipefs -a /dev/nvme0n1 Create new partition table Open nvme0n1 with gdisk:\n# gdisk /dev/nvme0n1 Create GPT partition table with option \u0026ldquo;o\u0026rdquo;.\nCreate EFI partition Create new EFI partition w/ 550 MB with option \u0026ldquo;n\u0026rdquo;, using the following parameters:\nPartition #1 Default starting sector +550M Change partition type to EFI System (ef00) Create boot partition Create new boot partition w/ 550 MB with option \u0026ldquo;n\u0026rdquo;, using the following parameters:\nPartition #2 Default starting sector +550M Leave default type of 8300 Create swap partition The old rule of thumb was that a swap partition should be the same size as the amount of memory in the system, but given the typical amount of memory in modern systems this is obviously no longer necessary. For my system with 16 or 32 GB of memory, a swap of 8 GB is rarely even used.\nCreate new Swap partition w/ 8 GB with option \u0026ldquo;n\u0026rdquo;, using the following parameters:\nPartition #3 Default starting sector +8G Change to linux swap (8200) Create root partition Create new root partition w/ remaining disk space with option \u0026ldquo;n\u0026rdquo;, using the following parameters:\nPartition #4 Default starting sector Remaining space Linux LUKS type 8309 And then exit gdisk.\nWrite file systems EFI partition Write file system to new EFI System partition:\n# cat /dev/zero \u0026gt; /dev/nvme0n1p1 # mkfs.fat -F32 /dev/nvme0n1p1 Boot partition Then boot partition:\n# cat /dev/zero \u0026gt; /dev/nvme0n1p2 # mkfs.ext2 /dev/nvme0n1p2 Root partition Prepare root partition w/ LUKS:\n# cryptsetup -y -v luksFormat --type luks2 /dev/nvme0n1p4 # cryptsetup luksDump /dev/nvme0n1p4 # cryptsetup open /dev/nvme0n1p4 archcryptroot # mkfs.ext4 /dev/mapper/archcryptroot # mount /dev/mapper/archcryptroot /mnt I use archcryptroot for the name of my encrypted 
volume, but change as necessary.\nSwap partition Then swap:\n# mkswap /dev/nvme0n1p3 # swapon /dev/nvme0n1p3 Create mount points # mkdir /mnt/boot # mount /dev/nvme0n1p2 /mnt/boot # mkdir /mnt/boot/efi # mount /dev/nvme0n1p1 /mnt/boot/efi System install Install base packages # pacstrap /mnt base base-devel linux linux-firmware grub-efi-x86_64 efibootmgr Generate fstab # genfstab -U /mnt \u0026gt;\u0026gt; /mnt/etc/fstab Enter new system # arch-chroot /mnt /bin/bash Set clock # ln -sf /usr/share/zoneinfo/America/Chicago /etc/localtime # hwclock --systohc Generate locale In /etc/locale.gen uncomment only: en_US.UTF-8 UTF-8\n# locale-gen In /etc/locale.conf, you should only have this line: LANG=en_US.UTF-8\n# nano /etc/locale.conf Set hostname \u0026amp; update hosts # echo linuxmachine \u0026gt; /etc/hostname Update /etc/hosts with the following:\n127.0.0.1 localhost ::1 localhost 127.0.1.1 linuxmachine.localdomain linuxmachine Set root password # passwd Update /etc/mkinitcpio.conf \u0026amp; generate initrd image Edit /etc/mkinitcpio.conf with the following:\nHOOKS=(base udev autodetect modconf block keymap encrypt resume filesystems keyboard fsck) Then run:\n# mkinitcpio -p linux Install grub # grub-install --target=x86_64-efi --efi-directory=/boot/efi --bootloader-id=ArchLinux Edit /etc/default/grub so it includes a statement like this:\nGRUB_CMDLINE_LINUX=\u0026quot;cryptdevice=/dev/nvme0n1p4:archcryptroot resume=/dev/nvme0n1p3\u0026quot; Generate final grub configuration:\n# grub-mkconfig -o /boot/grub/grub.cfg Exit \u0026amp; reboot # exit # umount -R /mnt # swapoff -a # reboot To be continued.\n","date":"2023-09-29T00:00:01Z","image":"https://www.jaredszajkowski.com/stack/2023/09/29/arch-linux-install/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2023/09/29/arch-linux-install/","title":"Arch Linux Install"},{"content":"Hello World Welcome to my website. 
This is meant to serve as a place for me to publish various posts from my explorations into Arch Linux, data science, quant finance, and other topics.\nThe theme has been adapted from the Hugo Theme Stack produced by Jimmy Cai.\nThis is the only theme that I have found that checks all of the following boxes:\nTheme for the static site generator Hugo Includes modules for archives Includes tags and topics/categories Includes built-in search functionality Simple interface that is easily navigable Highly extensible including modules for image galleries, posts, comment capabilities, etc. It is hosted on GitHub pages. I followed the install instructions that the theme author provided, including using GitHub codespace for editing in the cloud. There are only a few details that I ran into that he did not mention.\nDon\u0026rsquo;t forget to run Hugo to build the site. This creates the public directory, which is where the static site files are located. Make sure to update the branch to be gh-pages under Settings -\u0026gt; Pages -\u0026gt; Build and deployment -\u0026gt; Branch in GitHub. Make sure to remove the public directory from the .gitignore file. Otherwise GitHub will ignore the public directory and your site will show the README.md instead of the Hugo site. The site can be updated either through codespace, or locally as long as Hugo and its required dependencies have been installed.\nUpdating and pushing changes After making any changes, the site can be rebuilt and the updates pushed with a single command:\n$ hugo \u0026amp;\u0026amp; git add . \u0026amp;\u0026amp; git commit -am \u0026quot;Updating site\u0026quot; \u0026amp;\u0026amp; git push This can be put in a bash script to make it easier. Save the following as git-update.sh:\n1 2 3 4 5 6 #!/bin/bash echo What is the commit message? read message hugo \u0026amp;\u0026amp; git add . 
\u0026amp;\u0026amp; git commit -am \u0026#34;$message\u0026#34; \u0026amp;\u0026amp; git push Change permissions:\n$ chmod +x git-update.sh And then execute:\n$ ./git-update.sh References Here\u0026rsquo;s the full list of resources I referenced for deploying Hugo with GitHub pages:\nhttps://www.o11ycloud.com/posts/gh_hugo/ https://github.com/CaiJimmy/hugo-theme-stack https://medium.com/@magstherdev/github-pages-hugo-86ae6bcbadd\n","date":"2023-09-26T00:00:00Z","image":"https://www.jaredszajkowski.com/stack/2023/09/26/hello-world/cover.jpg","permalink":"https://www.jaredszajkowski.com/stack/2023/09/26/hello-world/","title":"Hello World"}]