*Copyright Maxwell Margenot. Edited Oct 11, 2018, on Quantopian.*

After trawling through the stats on the many, many backtests that the community has developed, we at Quantopian have determined that a new series of template algorithms is warranted. Looking at community activity cross-sectionally, we see a great deal of strong development work on technical signals (chiefly mean reversion and momentum). Unsurprisingly, far fewer algorithms tap into fundamental signals as their source of predictive power. This algorithm is a good starting point for anyone looking to incorporate fundamentals-driven signals into their repertoire.

# CapEx Vol

Cash-flow volatility is a well-studied metric that is often treated as a proxy for firm-level uncertainty. In this template algorithm we extend that idea to ask whether firms with relatively more volatile capital expenditures (i.e., spending on new buildings, plants, equipment, and the like) are also more unpredictable and, by extension, riskier and more likely to underperform firms with lower relative capex volatility. For more academic detail, take a look at this SSRN paper.
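To make the idea concrete before diving into the Pipeline code, here is a minimal sketch of the metric itself, outside Quantopian and using only pandas. The quarterly figures are hypothetical, chosen just to illustrate the scaling-then-dispersion step:

```python
import pandas as pd

# Hypothetical quarterly fundamentals for a single firm (illustrative numbers).
quarters = pd.period_range("2017Q1", periods=6, freq="Q")
capex = pd.Series([120.0, 95.0, 210.0, 80.0, 150.0, 190.0], index=quarters)
total_assets = pd.Series([5000.0, 5100.0, 5150.0, 5200.0, 5300.0, 5400.0], index=quarters)

# Scale capex by total assets so firms of different sizes are comparable,
# then take the dispersion over the trailing six quarters.
capex_ratio = capex / total_assets
capex_vol = capex_ratio.std(ddof=0)  # population std, matching np.std's default
```

A firm with lumpy, unpredictable spending gets a high `capex_vol`; a firm that invests steadily gets a low one. The algorithm below computes this same quantity per asset inside a Pipeline `CustomFactor`.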

As we look to expand the set of algorithms receiving allocations over the next few months we expect to give preference to new ideas that take advantage of a broader range of fundamental factors.

To get started, clone this algorithm, improve it with your own ideas, and submit it to the Quantopian Daily Contest.

**N.B.** As implemented here, this algo doesn’t fully meet all of the criteria for entry in the daily contest so we’re leaving that as an “exercise for the reader”.

```python
import numpy as np
import quantopian.algorithm as algo
import quantopian.optimize as opt
from quantopian.pipeline import Pipeline
from quantopian.pipeline.factors import CustomFactor
from quantopian.pipeline.filters import QTradableStocksUS
from quantopian.pipeline.data import Fundamentals

ZSCORE_FILTER = 3  # max |z-score| allowed before a security is treated as an outlier
ZERO_FILTER = 0.001  # minimum absolute weight below which a security is dropped

class TEM(CustomFactor):
    """
    TEM = standard deviation of the trailing six quarterly reports of
    capex scaled by total assets.
    """
    window_length = 390  # ~18 months of trading days, enough to span 6 quarters

    def compute(self, today, assets, out, asof_date, capex, total_assets):
        values = capex / total_assets
        for column_ix in range(asof_date.shape[1]):
            # The as-of date only changes when a new report arrives, so the
            # first occurrence of each date marks one quarterly observation.
            _, unique_indices = np.unique(asof_date[:, column_ix], return_index=True)
            quarterly_values = values[unique_indices, column_ix]
            # Pad with NaN if fewer than six reports fall within the window.
            if len(quarterly_values) < 6:
                quarterly_values = np.hstack([
                    np.repeat([np.nan], 6 - len(quarterly_values)),
                    quarterly_values,
                ])
            out[column_ix] = np.std(quarterly_values[-6:])

def initialize(context):
    algo.attach_pipeline(make_pipeline(), 'alpha_factor_template')

    # Rebalance at the start of each month, at market open.
    algo.schedule_function(func=rebalance,
                           date_rule=algo.date_rules.month_start(),
                           time_rule=algo.time_rules.market_open(),
                           half_days=True)

    # Record portfolio variables at the end of each day.
    algo.schedule_function(func=record_vars,
                           date_rule=algo.date_rules.every_day(),
                           time_rule=algo.time_rules.market_close(),
                           half_days=True)

def make_pipeline():
    # Capex volatility over the tradable US universe.
    capex_vol = TEM(
        inputs=[Fundamentals.cap_ex_reported_asof_date,
                Fundamentals.cap_ex_reported,
                Fundamentals.total_assets],
        mask=QTradableStocksUS()
    )

    # Higher volatility is expected to underperform, so negate the factor.
    alpha_factor = -capex_vol

    # Winsorize to limit the influence of extreme values, then standardize.
    alpha_w = alpha_factor.winsorize(min_percentile=0.02,
                                     max_percentile=0.98,
                                     mask=QTradableStocksUS())
    alpha_z = alpha_w.zscore()
    alpha_weight = alpha_z / 100.0

    # Screen out outliers, near-zero weights, and non-finite values.
    outlier_filter = alpha_z.abs() < ZSCORE_FILTER
    zero_filter = alpha_weight.abs() > ZERO_FILTER
    finite_filter = alpha_weight.isfinite()
    universe = (QTradableStocksUS()
                & outlier_filter
                & zero_filter
                & finite_filter)

    return Pipeline(
        columns={'alpha_weight': alpha_weight},
        screen=universe
    )

def before_trading_start(context, data):
    # Keep only the strongest signals: the top and bottom TOP_BOT_NUM names.
    TOP_BOT_NUM = 250
    output = algo.pipeline_output('alpha_factor_template')
    n = min(TOP_BOT_NUM, len(output) // 2)
    ranked = output.sort_values('alpha_weight')
    context.pipeline_data = ranked.head(n).append(ranked.tail(n))

def record_vars(context, data):
    # Plot the number of positions and leverage over time.
    algo.record(num_positions=len(context.portfolio.positions))
    algo.record(leverage=context.account.leverage)

def rebalance(context, data):
    # Retrieve the screened pipeline output.
    pipeline_data = context.pipeline_data
    alpha_weight = pipeline_data['alpha_weight']

    # Normalize so that gross exposure (the sum of absolute weights) is 1.
    alpha_weight_norm = alpha_weight / alpha_weight.abs().sum()
    objective = opt.TargetWeights(alpha_weight_norm)

    # No constraints currently
    constraints = []
    algo.order_optimal_portfolio(
        objective=objective,
        constraints=constraints
    )
```
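The trickiest part of the algorithm is the inner loop of `TEM.compute`, which collapses a daily window of fundamentals back into quarterly observations. It can be exercised in isolation with plain NumPy; the sketch below simulates one asset's column of inputs, where the as-of date repeats on every trading day until a new report arrives (the dates and ratios are hypothetical):

```python
import numpy as np

# Simulated daily inputs for one asset: the as-of date only changes when a
# new quarterly report arrives, so consecutive rows repeat it.
asof = np.array([1, 1, 1, 2, 2, 3, 3, 3, 4, 5, 5, 6, 6, 7])
values = np.array([.02, .02, .02, .05, .05, .01, .01, .01, .03, .04, .04, .02, .02, .06])

# np.unique returns the index of the FIRST occurrence of each as-of date,
# collapsing the daily series to one value per quarterly report.
_, first_ix = np.unique(asof, return_index=True)
quarterly = values[first_ix]

# Pad with NaN when fewer than six reports are available, then take the
# std of the most recent six (NaNs propagate, leaving the factor undefined).
if len(quarterly) < 6:
    quarterly = np.hstack([np.repeat(np.nan, 6 - len(quarterly)), quarterly])
tem = np.std(quarterly[-6:])
```

Note that `np.unique` sorts the dates it finds, which is safe here because as-of dates are already monotonically increasing within a pipeline window.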
