The Infamous Story of ENRON

The story of Enron is a story of greed and how a Houston-based energy company rocketed to the top echelon of Corporate America before losing everything.

From stodgy Oil & Gas merger, to high-flying corporate giant, to an astonishing demise

Formed from the 1985 merger of Houston Natural Gas and InterNorth, Enron began with humble roots. Kenneth Lay was an enterprising economics graduate from Missouri who learned the ropes of the oil and gas business early, earning his PhD in economics in 1970 and working his way up to CEO of Houston Natural Gas before its merger with InterNorth.

For years the company had solid (if not spectacular) results and even overcame a couple of near-fatal financial disasters resulting from oil futures and origination-guarantee deals gone bad. Jeff Skilling, a supremely confident Harvard MBA from Illinois, joined Enron's ranks after several years of consulting for the energy giant as part of Enron's cozy relationship with McKinsey and Company.

Enron's fatal flaw was the belief that accounting "creativity" could permanently hide fraud

In time, Skilling became COO and began to call for the mass hiring of elite MBA types and math gurus, whom he transformed into his "complex deal making" army. He became particularly close with Enron's oddball finance and accounting veteran Andrew Fastow, who paired the brains of Jeff's army with the creativity of accounting fraud to make Enron appear, at least to investors and banks, to be an extravagant capital-generating machine.

Fastow and his crack team of corporate fraudsters developed a network of shell companies known as SPEs or "special purpose entities" and used these as vehicles for hiding losses and booking fictitious deals- to the tune of several billion dollars of imaginary capital and unreported losses. Quarter after quarter, when Enron divisions were struggling to "hit the numbers" that Wall Street analysts expected- Andy would step in to save the day with his SPE magic that- at least temporarily- made bad news go away.

Another favorite method of Fastow and Skilling was "mark to market" accounting treatment of their energy deals- meaning that they reported- as current income- all estimated future income over the life of a deal- for virtually every deal they did. This is great when things are going well, but it is an obviously untenable situation. While Enron was flashing gaudy mark-to-market income figures to the Street, the future required them to actually service those deals- and never book another accounting profit, as each deal's entire income had already been reported.

Enron's pursuit of Wall Street's favor made a mockery of their Code of Ethics

Enron, which had once been a company with deep roots in Oil & Gas and was hands-on in developing pipelines and sourcing fossil fuels for delivery contracts, was now in the business of trading on energy futures that bore little to no resemblance to true tangible "present values". Everything was speculation. Everything was reduced to hedges and bets. Nothing was real anymore. And it all collapsed under the weight of its own obfuscation.

Sure there were other reasons Enron collapsed. There was the comical Enron Broadband Services which tried to take on the early internet giants like AOL, and went..... nowhere. There were notorious global deals in places like India and England that became financial albatrosses which only Fastow's shell games could attempt to mask- for a time. But it was really just simple greed and criminal accounting.

Jeff Skilling Harvard MBA abstract mastermind, avoider of details and implementation

Even the once-proud accounting firm Arthur Andersen was brought down by the fall of Enron, effectively dissolving after its obstruction-of-justice conviction. The firm had some early protestations about the use of SPEs and the unorthodox manner of applying profits and losses, but ultimately it went along with- and signed off on- the grossly improper financial reporting.

The Justice Department, the SEC and the FBI had long been looking at the company by the time Enron's offices were raided on January 22nd, 2002. What followed were the trials and convictions of several Enron executives, including Fastow, Skilling and Lay, on an assortment of fraud and conspiracy charges at the heart of the scandal.

Andy Fastow was given a reduced 6 year sentence after agreeing to cooperate and testify against his former bosses. He was released from prison in 2011 and is now a popular speaker at business ethics and accounting fraud conferences.

Skilling received 24 years in federal prison for his role. He was released to a Texas halfway house on August 30th of 2018.

Ken Lay died of a heart attack while awaiting sentencing.

The biggest losers of Enron's demise were Enron employees and common stockholders who bet big on Enron's future

The timeline, web of deceit and cast of characters in this tragedy are truly fascinating. Rebecca Mark, Ken Rice, Lou Pai and many more interesting personalities are woven into this spectacular story, which is told best by the people who (literally) wrote the book. For a comprehensive look into this business debacle, the award-winning book and documentary can be found here:

The Smartest Guys in the Room book by Bethany McLean and Peter Elkind

ENRON: The Smartest Guys in the Room

In the end, this was a tragedy of obscene hubris and ultimate humiliation. The irony is that Enron had a solid underlying business model; were it not for the lies that enabled its inflated financial reporting, Enron- albeit a smaller and less glamorous Enron- would likely still be in business today.

Capital Gains (Losses) and Capital Gains Tax

Capital gains are often thought of in the context of profiting from the sale of stock or another security-based financial product. Capital losses, on the other hand, are the opposite: the loss incurred from selling an asset for less than it cost. It is important to remember, however, that capital gains and losses can also result from other sales- the sale of a vehicle, the sale of a home, the sale of an antique, etc.



Capital gains tax is paid by sellers (both businesses and consumers) who have profited from the sale of some asset (bonds, stock in other businesses, company equipment sold for a profit).

A capital loss occurs when an asset is sold for less than its purchase price. This loss is usually deductible from taxes up to a certain amount.
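To make the arithmetic concrete, here is a tiny Python sketch (hypothetical prices- this is illustration, not tax advice): a gain or loss is simply the sale price minus what you paid.

```python
def capital_gain(cost_basis, sale_price):
    """Positive result is a capital gain; negative is a capital loss."""
    return sale_price - cost_basis

# Hypothetical examples: a stock sold at a profit, a car sold at a loss
print(capital_gain(cost_basis=1000.00, sale_price=1500.00))    # 500.0 (gain)
print(capital_gain(cost_basis=20000.00, sale_price=12000.00))  # -8000.0 (loss)
```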



Commodities and Securities Futures

"A futures contract is an agreement to buy or sell an asset at a future date at an agreed-upon price"

Futures markets such as the New York Board of Trade and the Chicago Mercantile Exchange facilitate the trading of futures contracts. Futures trading is often thought of only in terms of raw materials (commodities); however, financial products or "securities" are also traded in futures markets:

Commodities: A commodity is a raw material that has value and is more or less in constant demand (think- milk, eggs, pork, beef, chicken, lumber, iron, salt, crude oil, coal, etc.).

Securities (Financial): A security is a financial product- for example, a contract tied to an interest rate, the price of a stock, or the value of some kind of debt such as a CDO.

A recent history of returns on commodities futures by year and type


Futures trading is simply buyers and sellers betting on the future value of some product. In commodities, this could be a day trader who speculates that the price of oil is about to skyrocket and buys contracts to purchase oil at (he or she hopes) a lower price than it will fetch later.

Remember that futures trading is not limited to commodities

In securities futures, an example would be a buyer entering a contractual agreement to purchase some amount of stock for an agreed-upon price at some future date. This works to the buyer's advantage only if the stock's price on the future date is higher than the price agreed to in the futures contract.

At the heart of this kind of trading (and one could argue all trading) is the idea of betting for (+) or hedging against (-) the inevitable fluctuation of future value.
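A minimal Python sketch of that payoff arithmetic (hypothetical prices; ignores margin requirements, fees and contract multipliers):

```python
def long_futures_payoff(agreed_price, spot_at_settlement, quantity=1):
    """Payoff to the buyer (long side) at settlement; the seller's payoff is the negative."""
    return (spot_at_settlement - agreed_price) * quantity

# Hypothetical: agree today to buy 100 barrels of oil at $60/barrel
print(long_futures_payoff(60.0, 75.0, 100))  # 1500.0 - spot rose, buyer gains
print(long_futures_payoff(60.0, 50.0, 100))  # -1000.0 - spot fell, buyer loses
```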


Reference: https://finance.zacks.com/futures-vs-commodities-5663.html

Continuous Integration

"Continuous Integration (CI) is a development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early."




The idea behind CI is that all developers continuously track a "master" branch (or some other common integration branch)- pulling down its changes and incrementally integrating their own branch/feature code into it. Problems stemming from incompatible or not-easily-mergeable features then surface at the first sign of incompatibility, rather than at the end of feature-branch development, when any incompatibilities are magnified and force time-consuming redesign efforts to make things merge and interoperate cleanly.

In short, CI is used to nip potential integration problems in the bud.

Dividends


Dividends are a company's optional distribution of (typically) cash to stockholders and provide another way to earn money from investing beyond growing the value of one's portfolio.


A dividend is defined as "a sum of money paid regularly (typically quarterly) by a company to its shareholders out of its profits (or reserves)".

A dividend yield expresses the dividend amount relative to the company's current share price. You can calculate it by dividing the total dividend paid over some 12-month period by the current stock price.

Some companies regularly pay out a cash dividend, which can make their stock more attractive. Johnson Controls (JCI), for instance, has managed to pay a quarterly dividend every year since 1887. It paid a total dividend of $1.04 in 2018, and the stock price as of today is $31.21.




There are two ways to calculate a company's current dividend yield: (1) using "forward dividends" or (2) using "trailing dividends". Trailing uses the preceding 12 months of payouts, while forward uses the expected payouts over the following 12 months. As of today (1/2/2019), using trailing dividends- that is, in relation to the 2018 total dividend payout- JCI's dividend yield was:

$1.04 / $31.21

...or 3.3%.
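That same trailing-yield arithmetic, as a quick Python sketch (using the JCI figures quoted above):

```python
def dividend_yield_pct(trailing_dividend, current_price):
    """Trailing dividend yield as a percentage of the current share price."""
    return trailing_dividend / current_price * 100

# JCI: $1.04 total 2018 dividend, $31.21 share price
print(round(dividend_yield_pct(1.04, 31.21), 1))  # 3.3
```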


As you can see from the charts above, General Electric and Honeywell have paid out cash dividends consistently for years. But GE has recently cut its payouts dramatically, probably due to its ongoing restructuring and spin-off efforts.

Reference:

https://www.nasdaq.com/symbol/jci/dividend-history


Collateralized Debt Obligations (CDOs)

Collateralized Debt Obligations are units of packaged debt- sometimes referred to as "Frankenstein debt"- consisting of various kinds of debt obligations (auto, home, credit card, student loans, corporate debt, etc.) of various credit ratings (AAA, AA, A, BBB, BB, B, CCC, CC, etc.).

"Originally developed as instruments for the corporate debt markets, after 2002 CDOs became vehicles for refinancing mortgage-backed securities." -Wikipedia


The idea behind this type of investment is that although it contains lots of high-risk debt (that may well default), that risk is offset by the better rated debt in the CDO package.

There are also CDOs known as "CDOs squared". These are also simply packages of variously rated debt, but with an additional layer of abstraction (obfuscation). Instead of various cash-backed assets and other kinds of direct claims on debt in the bundle, CDO^2 consist of pieces or "tranches" of other CDOs.

Additionally, there are Synthetic CDOs and CDSs. A Synthetic CDO is not backed by debt assets but rather derivatives of debt assets known as "Credit Default Swaps" (CDSs), which are basically CDO insurance. The buyer of a CDS makes periodic premium payments in much the same way as premiums for home and auto insurance.

CDSs provide a way for investors to hedge CDO investments. If a credit event (default on a CDO's underlying debt asset) occurs, the buyer of a credit default swap is protected from losses. If no credit event occurs, the seller of the CDS continues to collect the premium payments for the duration of the term of the CDS.
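Those two outcomes can be sketched in a few lines of Python. This is a toy model with made-up numbers- real CDS pricing involves premium accrual, recovery-rate assumptions and much more:

```python
def cds_buyer_cashflow(notional, annual_premium_rate, years_paid, credit_event, recovery_rate=0.4):
    """Net cash flow to the protection buyer over the life of the swap (toy model)."""
    premiums_paid = notional * annual_premium_rate * years_paid
    if credit_event:
        # Default: buyer is compensated for the non-recovered portion of the notional
        payout = notional * (1 - recovery_rate)
        return payout - premiums_paid
    # No default: the premiums are simply the cost of the hedge
    return -premiums_paid

# Hypothetical: $10M of protection at a 2% annual premium
print(cds_buyer_cashflow(10_000_000, 0.02, years_paid=3, credit_event=True))   # buyer nets ~5.4M
print(cds_buyer_cashflow(10_000_000, 0.02, years_paid=5, credit_event=False))  # buyer is out ~1M in premiums
```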

Crazy stuff, huh? Be careful, Wall Street.. Lehman Brothers never saw it coming... 😶

2008 was obviously the wake-up call, trillions in wealth vanished as values crashed to Earth

Price Discrimination

Price Discrimination is the act of selling the same product or service at different prices to different buyers in order to match differing levels of demand. It is used to ensure business from lower demand markets and earn the maximum possible profit from higher demand markets. This can be illustrated in the case of your grandparent or child getting a discount at the movie theater because they tend to have a lower demand than the average moviegoer.

Examples of Price Discrimination

"Price Equilibrium" (PE) is the price point at which a Supply Curve and Demand Curve intersect. Any price charged above the PE will result in more profit (seller surplus) and any price below PE will result in less profit (consumer surplus, missed opportunity by seller). Pricing products and services is done through the process known as Marginal Cost Analysis.

Price discrimination can be quite problematic when it is applied on the basis of ethnicity or socioeconomic status.

Just a matter of risk data? Or a racially biased algorithm used by banks?

Although the discriminatory practice known as "redlining" has been outlawed for over 50 years, banks continue to charge higher mortgage rates to non-white consumers. From the bank's perspective they would argue that it is coincidence and simply reflects the consumer's credit and a higher risk they are taking on. Others would argue that minority loan-seekers are being priced out of the American Dream because of the color of their skin.

Gas stations tend to have higher-than-average prices in low-income areas because customers there have fewer nearby options, are often less mobile and are in general less discriminating than shoppers in a wealthy suburb, who can leverage greater local competition and their own mobility to be more selective in their consuming habits (which is to say, more likely to sharply increase or decrease demand if a price is out of equilibrium).


Now in some cases price discrimination makes perfect sense. Take, for example, a hardware store in Arizona and a hardware store in Wisconsin that both sell snowblowers. The store in Arizona is almost assuredly going to sell its snowblowers for far less, as the demand for snowblowers is very low in that part of the country.

But in Wisconsin, where it snows every year, demand is reliable, so the Wisconsin store is likely to charge much more than the Arizona store. Furthermore, even within Wisconsin, stores will charge less for snowblowers in the summer than in the winter, when demand is higher.

Companies differentiate prices to match demand for different types of consumers

With the exception of 4th degree price discrimination, when a price is different for different types of consumption of the same product or service- it is because demand for that product or service is different among consumers and so companies set the prices accordingly.

TwickrTape

Real-time scrolling Tweets related to financial news alerts and updates. Just enter a Twitter handle to see a marquee of live-streaming tweets from it.

TL;DR - the working app is here: https://twickrtape.azurewebsites.net
...and here on the Google Play Store: https://play.google.com/store/apps/details?id=io.cordova.twicktape

Use button in lower-left to toggle different Twitter handles

The code is a hodgepodge of various references and vestiges of past personal projects. My initial aim was to get something working end-to-end as I envisioned, and I was able to achieve that pretty easily through the Twitter API and .NET.

The process is standard API stuff: first you authenticate and receive a token, which you then pass with subsequent calls to prove your requests are valid. Then, using a simple (REST API) HTTP GET, we fetch the content (Twitter timeline information) we are interested in.

The gist of the code is .NET C# (an ASP.NET Controller method) and listed below:

  public ActionResult GetTwickr(string handle = "business")  
     {  
       // Set your own keys and screen name  
       var oAuthConsumerKey = "XXXXXXXXXXXXX"; // "API key";  
       var oAuthConsumerSecret = "XXXXXXXXXXXXXXXXXXXXX"; // "API secret key";  
       var oAuthUrl = "https://api.twitter.com/oauth2/token";  
       var screenname = "@" + handle; // Twitter handle whose timeline will be shown  
       // Authenticate  
       var authHeaderFormat = "Basic {0}";  
       var authHeader = string.Format(authHeaderFormat,  
         Convert.ToBase64String(Encoding.UTF8.GetBytes(Uri.EscapeDataString(oAuthConsumerKey) + ":" +  
         Uri.EscapeDataString((oAuthConsumerSecret)))  
       ));  
       var postBody = "grant_type=client_credentials";  
       HttpWebRequest authRequest = (HttpWebRequest)WebRequest.Create(oAuthUrl);  
       authRequest.Headers.Add("Authorization", authHeader);  
       authRequest.Method = "POST";  
       authRequest.ContentType = "application/x-www-form-urlencoded;charset=UTF-8";  
       authRequest.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;  
       using (Stream stream = authRequest.GetRequestStream())  
       {  
         byte[] content = ASCIIEncoding.ASCII.GetBytes(postBody);  
         stream.Write(content, 0, content.Length);  
       }  
       authRequest.Headers.Add("Accept-Encoding", "gzip");  
       WebResponse authResponse = authRequest.GetResponse();  
       // deserialize into an object  
       TwitAuthenticateResponse twitAuthResponse;  
       using (authResponse)  
       {  
         using (var reader = new StreamReader(authResponse.GetResponseStream()))  
         {  
           // Read the raw JSON and deserialize it with Newtonsoft.Json  
           var objectText = reader.ReadToEnd();  
           twitAuthResponse = JsonConvert.DeserializeObject<TwitAuthenticateResponse>(objectText);  
         }  
       }  
       try  
       {  
         // Get timeline info  
         var timelineFormat = "https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name={0}&include_rts=1&exclude_replies=1&count=5";  
         var timelineUrl = string.Format(timelineFormat, screenname);  
         HttpWebRequest timeLineRequest = (HttpWebRequest)WebRequest.Create(timelineUrl);  
         var timelineHeaderFormat = "{0} {1}";  
         timeLineRequest.Headers.Add("Authorization", string.Format(timelineHeaderFormat, twitAuthResponse.token_type, twitAuthResponse.access_token));  
         timeLineRequest.Method = "GET";  
         WebResponse timeLineResponse = timeLineRequest.GetResponse();  
         var timeLineJson = string.Empty;  
         string scrolltxt = string.Empty;  
         using (timeLineResponse)  
         {  
           using (var reader = new StreamReader(timeLineResponse.GetResponseStream()))  
           {  
             timeLineJson = reader.ReadToEnd();  
           }  
         }  
         // deserialize into an object  
         dynamic obj = JsonConvert.DeserializeObject(timeLineJson);  
         foreach (var r in obj) // each item in the JSON array is one tweet  
         {  
           scrolltxt += " ***** " + r.text;  
         }  
         var model = new LoggedInUserTimelineViewModel { TimelineContent = scrolltxt, Handle = handle };  
         return View("TwickrMain", model);  
       }  
       catch (Exception) { /* on any API failure, fall through and render the view without timeline content */ }  
       return View("TwickrMain");  
     }  

Next, using the Newtonsoft.Json JsonConvert.DeserializeObject method, we deserialize the JSON response and zero in on the 'text' property of each entity (tweet) in the array of JSON results (tweets).

And finally, using some basic Bootstrap HTML, JavaScript and CSS we are able to wire up a very simple, but effective app that interacts with the Twitter API in real-time.

Reference: https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html

Using PowerShell and youtube-dl to automate grunt work

Earlier this evening I was tasked by my wife with creating audio files from some (DRM-free) YouTube content. I knew how to do this via a manual process; however, after the 5th manual file conversion I remembered the youtube-dl Python project and figured there had to be an easier and quicker way to do this file-conversion grunt work.

And sure enough... youtube-dl (with some help from PowerShell) does the trick.

Expected input and output

For starters (we are assuming you have already installed Python 3), you will need to pip install the youtube-dl module:

 pip install youtube-dl  

You will also need the ffmpeg binary available on your PATH for any audio post-processing. Note that ffmpeg itself is a standalone program (get it from ffmpeg.org or your system's package manager), not a Python module, so "pip install ffmpeg" does not install the real encoder.

Next, create a youtube-uris.txt with the URI of each song you want to convert on separate lines of the file.

Finally, you can use the following PowerShell script to copy the song URIs into a string array, which is then iterated through, converting each array item into a playable audio file (format 140 is an audio-only m4a stream), saved in whatever directory you run the script from.

 [string[]]$arrayFromFile = Get-Content -Path 'C:\sand\youtube-dl-sandbox\youtube-uris.txt'  
 foreach ($uri in $arrayFromFile) {  
   youtube-dl -f 140 $uri  
  }  

That's it. This script pattern can be used to automate other manual/repetitive tasks that require running a procedure against every item in a large list.


Reference: https://github.com/rg3/youtube-dl

Generate Secure Machine Key Section for Web.config via PowerShell

Machine Keys are used in ASP.NET for securing machines that are part of a web farm as well as for sharing encrypted application session and state information.

This PowerShell script (function) can be called (once it has been run and saved to your PS session) via:

 PS C:\> Generate-MachineKey


With the output from this PS function, you can copy and paste into your web.config, i.e.:
 <configuration>  
  <system.web>  
   <machineKey ... />  
  </system.web>  
 </configuration>  

Generate-MachineKey function definition/PS script
 # Generates a <machineKey> element that can be copied + pasted into a Web.config file.  
 function Generate-MachineKey {  
  [CmdletBinding()]  
  param (  
   [ValidateSet("AES", "DES", "3DES")]  
   [string]$decryptionAlgorithm = 'AES',  
   [ValidateSet("MD5", "SHA1", "HMACSHA256", "HMACSHA384", "HMACSHA512")]  
   [string]$validationAlgorithm = 'HMACSHA256'  
  )  
  process {  
   function BinaryToHex {  
     [CmdLetBinding()]  
     param($bytes)  
     process {  
       $builder = new-object System.Text.StringBuilder  
       foreach ($b in $bytes) {  
        $builder = $builder.AppendFormat([System.Globalization.CultureInfo]::InvariantCulture, "{0:X2}", $b)  
       }  
       $builder  
     }  
   }  
   switch ($decryptionAlgorithm) {  
    "AES" { $decryptionObject = new-object System.Security.Cryptography.AesCryptoServiceProvider }  
    "DES" { $decryptionObject = new-object System.Security.Cryptography.DESCryptoServiceProvider }  
    "3DES" { $decryptionObject = new-object System.Security.Cryptography.TripleDESCryptoServiceProvider }  
   }  
   $decryptionObject.GenerateKey()  
   $decryptionKey = BinaryToHex($decryptionObject.Key)  
   $decryptionObject.Dispose()  
   switch ($validationAlgorithm) {  
    "MD5" { $validationObject = new-object System.Security.Cryptography.HMACMD5 }  
    "SHA1" { $validationObject = new-object System.Security.Cryptography.HMACSHA1 }  
    "HMACSHA256" { $validationObject = new-object System.Security.Cryptography.HMACSHA256 }  
    "HMACSHA385" { $validationObject = new-object System.Security.Cryptography.HMACSHA384 }  
    "HMACSHA512" { $validationObject = new-object System.Security.Cryptography.HMACSHA512 }  
   }  
   $validationKey = BinaryToHex($validationObject.Key)  
   $validationObject.Dispose()  
   [string]::Format([System.Globalization.CultureInfo]::InvariantCulture,  
    "<machineKey decryption=`"{0}`" decryptionKey=`"{1}`" validation=`"{2}`" validationKey=`"{3}`" />",  
    $decryptionAlgorithm.ToUpperInvariant(), $decryptionKey,  
    $validationAlgorithm.ToUpperInvariant(), $validationKey)  
  }  
 }  

Accessing SQL Server Data in R

Importing SQL Server data into R for analysis is pretty straightforward. You will obviously need R installed on your machine. The following R code (run here from R Studio) will connect to your SQL Server database:

 library(RODBC)  
 dbconnection <- odbcDriverConnect('driver={SQL Server};server=.;database=CLARO;trusted_connection=true')  
 initdata <- sqlQuery(dbconnection, 'SELECT * FROM [CLARO].[dbo].[Fielding];')  
 odbcClose(dbconnection)  # close the connection when finished  


SELECT data from a SQL Server database output in R Studio


Accessing SQL Server Data in Python

So you want to access Microsoft SQL Server from your Python script(s)?

After weeding out some long-abandoned and/or nonworking solutions, I discovered a very simple Python ODBC driver that works with virtually all SQL Servers since MSSQL 2005 called "pyodbc". 

First, you will need to install the Microsoft ODBC Driver for SQL Server (version 13.1 or 17 should work) on your machine in addition to installing the pyodbc driver.

Next, get the pyodbc module for Python by running this from Windows command prompt:

pip install pyodbc

Then open up a python shell using 'py' or 'python' and enter the following after editing configuration values to match your development environment:

 import pyodbc  
 cnxn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=WideWorldImporters;UID=DemoUser;PWD=123Password')  
 cursor = cnxn.cursor()  
 #Sample of a simple SELECT  
 cursor.execute("SELECT TOP (100) Comments, count(*) FROM WideWorldImporters.Sales.Orders GROUP BY Comments")  
 row = cursor.fetchone()   
 while row:   
   print(str(row[0]) + ': ' + str(row[1]))  # str() guards against NULL Comments values  
   row = cursor.fetchone()  

Running this code will produce output like the below if you have configured everything correctly (note this example makes use of Microsoft's SQL Server demo WideWorldImporters database):


Reference: https://docs.microsoft.com/en-us/sql/connect/python/pyodbc/step-3-proof-of-concept-connecting-to-sql-using-pyodbc?view=sql-server-2017

.udl for DbConnection Check

This is a useful way to quickly check SQL credentials and/or RDBMS connectivity when working on a Windows OS. Just create a file in any editor (e.g. Notepad) and save it with a .udl extension, which makes it a Microsoft Data Link file. Then right-click the file and inspect Properties >> "Connection" tab.

Credit to a former colleague of mine (thanks Gene David!) who showed me how to use this simple but very useful trick.
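For reference, a configured .udl file is just a small text file holding an OLE DB connection string. A typical one (placeholder values shown- your provider and server will differ) looks roughly like this:

```text
[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=master;Data Source=localhost
```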