PBIRS REST API (public, completely customizable API)
If you use PBIRS to publish PBI reports to a centralized portal, or just want the ability to programmatically control your PBI development, deployments and system maintenance, there is a PBIRS REST API you may not be aware of that can supercharge your PBI development environment.
It is not officially published by Microsoft, but it is completely documented on SwaggerHub.com
It bears a close resemblance to the original SSRS v1 and v2 APIs, but in addition to all of the SSRS functionality, it provides operations for managing PBI reports, data sources, PBI data refreshes (SSRS report data, by contrast, is refreshed on demand), PBI report scheduling, uploaded Excel workbooks and many other really useful features that developers can leverage to get the most out of a Power BI environment.
Publishing and updating PBI reports and data sources, posting ExecutionLog entries and more are all made available via the PBIRS API.
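For example, here is a minimal PowerShell sketch of listing items via the PBIRS REST API. This is illustrative only: it assumes the default "reports" virtual directory, Windows authentication and a placeholder server URL:

$pbirsServer = 'http://localhost/reports'

# GET /api/v2.0/PowerBIReports returns the catalog of published .pbix reports
$pbixReports = Invoke-RestMethod -Uri "$pbirsServer/api/v2.0/PowerBIReports" -UseDefaultCredentials
$pbixReports.value | Select-Object Id, Name, Path

# GET /api/v2.0/CatalogItems lists every item (folders, reports, data sources, etc.) on the server
$items = Invoke-RestMethod -Uri "$pbirsServer/api/v2.0/CatalogItems" -UseDefaultCredentials
$items.value | Select-Object Type, Name, Path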
Power BI REST API (MSFT proprietary API)
And then there is the all-powerful Power BI REST API, which serves as a successor to the ReportExecution2005.asmx and ReportService2010.asmx Reporting Services SOAP APIs that the original SSRS REST API was built upon.
This API is publicly documented (as are its SDKs, which suffice for most usage scenarios) but is not as extensible as the original SSRS REST API (i.e. you must authenticate with Entra ID and a Microsoft Power BI online account; no custom authentication is possible). It does, however, provide nearly all of the abundant paginated report and data source functionality of the original SSRS SOAP API, along with a cornucopia of new API operations for managing Power BI users, workspaces and content.
The following is an example of using the Power BI REST API with PowerShell (via the MicrosoftPowerBIMgmt PowerShell module; if connecting from .NET you'd use the Microsoft.PowerBI.Api .NET library/NuGet package; an example of using the Power BI REST API in .NET can be found here):
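# A sketch along the lines of that example (not the original post's exact script);
# the workspace and report names below are placeholders.
# Requires: Install-Module MicrosoftPowerBIMgmt
Connect-PowerBIServiceAccount          # interactive Entra ID sign-in

# List all workspaces the signed-in user can access
Get-PowerBIWorkspace

# List the reports in a given workspace
$workspace = Get-PowerBIWorkspace -Name 'Sales Analytics'
Get-PowerBIReport -WorkspaceId $workspace.Id

# Get an individual PBI report by name
Get-PowerBIReport -WorkspaceId $workspace.Id -Name 'Quarterly Sales'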
This demonstrates getting a list of PBI workspaces and reports, and getting an individual PBI report by name
With these two APIs, organizations can continue to manage their SSRS/PBIRS environments and start managing their cloud-based Power BI environments- programmatically.
Microsoft has announced the inevitable- SSRS is not long for this world.
SSRS is now PBIRS and sadly, after 2033, PBIRS is likely going away in favor of costly, subscription-based MSFT reporting.
There will apparently be no new features for RDL-based paginated reports moving forward unless MSFT changes course. All MSFT investments in data visualization and business intelligence are going to Power BI and the other "Power Platform" tools.
For over 20 years, SSRS and RDL-based paginated reports have served a vital role supporting business intelligence and data visualization.
This is not entirely a bad thing, but I do have some thoughts on the sad reality that is the current state of Power BI, namely- its massive feature and extensibility limitations with respect to its predecessor, SSRS.
The first big one is report access. SSRS reports were available to anyone in the enterprise. And with custom authentication like extRSAuth, reports could be made available to any logged-in SaaS customer or otherwise authenticated client- for free. Almost everything in Power BI requires a monthly subscription.
Note that the "Free" pricing tier has significant limitations (it is not useful for much beyond developing reports in isolation...)
The second big one is scheduled reports, and report archiving for auditing and comparison purposes via report snapshot functionality. Scheduled reports and snapshots are made possible by the PBI Report Server on-prem engine, but as the PBIRS on-prem engine is only supported through January of 2033, the question for organizations considering continuing with RDL-based schedulable and snapshottable reports is- "will we be cut off (unsupported) after January 2033?".
Sadly, everything is moving to a subscription-based model, and this has everything to do with short-term "value demonstration" and the prioritization of short-term profits over long-term product excellence and long-term customer satisfaction (more on this in a future post...).
Not an ideal way to present an external customer-facing report...
If I had to make a bet on the future, based on MSFT's current avaricious and short-sighted "Subscription-and-AI-everything-customer-opinion-and-developers-be-damned*" business direction- SSRS and its successor PBIRS will be completely phased out after 2033, and the only option will be Power BI reports, with Power BI offering some expensive, newfangled version of schedulable .pbix reports. Probably called something ridiculous like "Fabric Delivery Service". I doubt they will continue to support snapshot auditing capabilities, but they should.
Lastly, a sad consequence of the universal move away from code sharing, "actually free/unlimited use" software and software extensibility in general, is that Power BI binary files (.pbix) cannot be utilized outside of being decompiled by a Power BI editor. In the heyday of SSRS, you could open any .rdl, .rsd or .rds file and immediately inspect a report, dataset or data source (XML) file and make edits on the fly through a text editor or an API. This is no longer possible.
There have been innumerable SSRS scripts implemented at organizations to update large batches of data sources, datasets and reports to point to new data sources or data fields, or to make mass updates to column text, report labeling or table definitions. So the removal of this functionality is kind of a big (raw) deal.
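As a hypothetical illustration of the kind of script RDL's open XML format made trivial, here is a PowerShell sketch that repoints every .rdl file in a folder from one server to another (the folder path and server names are placeholders):

$oldServer = 'OLDSQL01'
$newServer = 'NEWSQL01'

Get-ChildItem -Path 'C:\Reports' -Filter *.rdl -Recurse | ForEach-Object {
    [xml]$rdl = Get-Content $_.FullName -Raw
    # Embedded data source connection strings live under DataSources/DataSource/ConnectionProperties
    foreach ($ds in $rdl.Report.DataSources.DataSource) {
        $ds.ConnectionProperties.ConnectString =
            $ds.ConnectionProperties.ConnectString -replace $oldServer, $newServer
    }
    $rdl.Save($_.FullName)
}

No decompilation, no proprietary editor- just XML.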
Paginated RS Reports (.rdl)
RDL report files are XML-based and report definitions are clearly defined within the files; fortunately, PBIRS will continue to support paginated reports and the PBI Report Server database and engine for several more years.
Interactive PBI Visualizations (.pbix)
PBI report files are illegible compiled binary code; report definitions can only be decompiled, viewed and modified through tools like the online Power BI editor or the downloadable Power BI Desktop program. PBI interactive reports are the MSFT-recommended approach for all data reporting (except, apparently, their own, as demonstrated below by the example of an SSMS 21/SQL Server 2025 standard, RDL-based database report).
There is a long way to go before SSRS reports disappear from the MSBI landscape completely, however; SSRS will continue to be supported until January 11, 2033, according to Microsoft's recently posted Reporting Services consolidation FAQ.
In fact, if you look at Microsoft products like SSMS 21, you'll see regular old SSRS/RDL reports being rendered through the Report Viewer control when running reports for SSIS execution history, Job execution history, Database Standard Reports and many other SSMS reports.
Right-click any SQL Server database and you can view a suite of useful (if outdated looking) Standard Reports
Ngl these reports look very 2008ish (3D is not recommended for serious dataviz)- but they are indeed SSRS/RDL
Comparing search terms via Google Trends, it is clear that the push to Power BI accelerated in 2017 and has ramped up to the point where general interest in SSRS is negligible compared to interest in Power BI, despite SSRS offering robust, extensible on-prem report solutions for free and Power BI being buggy, considerably "locked down", and charging a relatively expensive monthly per-user subscription.
Google search interest in Power BI eclipsed SSRS in 2017 and has only risen since
But this is merely how marketing-sales-product singularity and product obsolescence works. If Galactus (MSFT) deems it so, it'll be so.
😕
😞
*we have come a long, long way from then-Microsoft CEO Steve Ballmer's infamous but earnest "Developers! Developers! Developers!" rant at a Windows conference in 2000. 😐
Each of these rules will help guide you to the delivery of a compelling and effective visual story
U.S. Presidential Elections 1900-2024
A picture really can depict 1000s of words
From the Industrial Revolution to WWI to the Great Depression to the New Deal to WWII to the Red Scare to the Civil Rights Movement to Neoconservatism to Neoliberalism to 9/11 to the Great Recession to MAGA, these small multiples charts beautifully encapsulate U.S. presidential elections and the attendant American political eras that fueled them in these past 124 years.
Major U.S. Economic Events
1929-1932 Great Depression
The gridlines are distracting and unnecessary but this chart, and the informational callouts, explain the stock market crash that caused the Great Depression
Post-WWII Boom
The United States was the biggest gainer among the countries whose GDP per capita spiked after WWII
Oil Prices and Inflation
The U.S. experienced huge inflation spikes in the 1970s due to oil price "shocks" and again in 2022 due to Russian oil sanctions and COVID stimulus
1980s Savings and Loan Crisis
Well over 1,000 banks failed as a result of the money market and junk-bond deregulation that led to the S&L crisis; a wave of failures on a scale not seen before or since
Millennium Dot-Com Bubble
Between 1998 and 2002, the NASDAQ rose from ~2,000 to over 5,000, only to come crashing back to just over 1,000 by the end of the bubble burst
2007-2009 Great Recession
Due to internationally linked financialization, the Great Recession impacted not just the U.S. economy, but economies across the world
2020 COVID and U.S. Unemployment
This graph illustrates the impact that COVID had on U.S. unemployment, which subsided once vaccines were developed, re-opening economies worldwide
Music Sales
Fascinating composition bar chart showing how music sales have shifted from vinyl to cassettes to CDs to mp3s to streaming
Whenever you present a story about data, you will invariably be displaying some values (along a time-series axis, or statically/in isolation) within a categorization-based visual. The four primary chart types are Comparison, Relationship, Distribution and Composition. We will briefly pose the questions that illustrate what each of these chart types aims to answer.
Comparison: How much of each subgroup exists in relation to the other subgroups?
Relationship: How did a value change over time, or in relation to some other metric? (e.g. "how are various foreign currency exchange rates impacted by the movement of the U.S. federal funds (interest) rate?")
Distribution: What is the concentration of values within different percentiles if you chart the data along a linear scale?
Composition: What are the sizes of each of the constituent parts that comprise the whole of the thing you are trying to depict or explain?
Other key data visualization concepts to know and always be considering:
Avoid Chartjunk: The goal should be to encode as much information as possible using near-exclusively "data ink", and as little "non-data ink" as possible. "Chartjunk" is any chart content that does not communicate information relevant to your data visualization; charts and graphs containing excessive non-data ink will only confuse the consumer of your visualization and hinder the expression of the message it is meant to convey.
Chartjunk should be ruthlessly excised wherever it is found. Only include the ink that serves the communication of your data visualization; all extra clutter will detract from and degrade your dataviz and your message- and communicating that message is the entire purpose of graphs and charts.
Know Your Audience: It is important that you know your audience. A chart presented in a scientific or academic journal is often expected to contain values derived from complex ratios and formulas and labeling using esoteric language; a chart presented for general consumers in a newspaper or magazine is not. You should have an idea of what the baseline expectations are for the data you are presenting and ensure that you communicate your visualization in a way that is easy for your target audience to understand.
Data Integrity: Many charts have been used as propaganda and to otherwise mislead people. This is done by using outright fake data or trying to elicit specious insights from thin data sets that do not provide a complete picture of what a particular data point or set of data points means within the context of other data it is a part of.
This is the entire view of _Report.cshtml which renders the RS report; see the extRSAuth project for how to enable non-Windows AD SSRS client authentication
The <iframe> is loaded with the URL of the report on your report server, which serves up a ReportViewer control view of the report
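Conceptually, the markup boils down to something like this sketch (the URL format assumes the default ReportServer virtual directory; the server, folder and report names are placeholders):

<!-- _Report.cshtml: the ReportViewer-rendered report, embedded via <iframe> -->
<iframe src="https://reports.example.com/ReportServer/Pages/ReportViewer.aspx?%2fSales%2fSalesSummary&rs:Command=Render&rc:Toolbar=true"
        style="width:100%; height:100vh; border:0"></iframe>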
SQL Server 2025 introduces a convenient way to get data from a REST API endpoint directly through T-SQL and SQL Server utilities.
Outlined in lime green are the two items SQL Server 2025 handles well, discussed in this post
Prior ways of doing this usually involved using MSXML2.XMLHTTP (a COM object provided by Microsoft XML Core Services) through the sp_OA* OLE Automation extended stored procedures, but with MSSQL 2025 there is a new system stored procedure, sp_invoke_external_rest_endpoint, that is very readable and easy to use to get JSON (or XML) from an API response.
This brief article describes what an SP to get this data may look like, as well as the code to parse the JSON response and format the result as a table (vs. sending back all the JSON for the client to parse).
Here is an SP which fetches Approval Polling data on current U.S. president Donald Trump:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:      colin fitzgerald
-- Create date: 20250625
-- Description: SP to fetch data from VoteHub API
-- =============================================
CREATE OR ALTER PROCEDURE dbo.sp_GetVoteHubPollingData
    @LongView BIT = 0
AS
BEGIN
    SET NOCOUNT ON;

    EXECUTE sp_configure 'external rest endpoint enabled', 1;
    RECONFIGURE WITH OVERRIDE;

    DECLARE @ret AS INT, @response AS NVARCHAR(MAX);

    EXECUTE @ret = sp_invoke_external_rest_endpoint
        @url = N'https://api.votehub.com/polls?poll_type=approval&subject=Trump',
        @headers = N'{"Accept":"application/json"}',
        @method = 'GET',
        @response = @response OUTPUT;

    -- The @response envelope holds the API payload under its 'result' key
    ;WITH ResponseContent AS
    (
        SELECT [key], [value] FROM OPENJSON(@response)
    ),
    ResponseJson AS
    (
        SELECT [value]
        FROM OPENJSON((SELECT [value] FROM ResponseContent WHERE [key] = 'result'))
    )
    SELECT
        -- [value],
        -- id,
        pollster,
        [subject],
        poll_type,
        sample_size,
        created_at,
        approve,
        disapprove
    FROM ResponseJson
    OUTER APPLY OPENJSON([value]) WITH
    (
        id          nvarchar(255),
        pollster    nvarchar(255),
        [subject]   nvarchar(255),
        poll_type   nvarchar(255),
        sample_size int,
        created_at  datetime,
        approve     decimal(5,2) '$.answers[0].pct',
        disapprove  decimal(5,2) '$.answers[1].pct'
    )
    WHERE created_at >= CASE WHEN @LongView = 0 THEN DATEADD(mm, -3, GETDATE()) ELSE created_at END
    ORDER BY created_at DESC;

    EXECUTE sp_configure 'external rest endpoint enabled', 0;
    RECONFIGURE WITH OVERRIDE;
END
GO
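Calling the SP is then a one-liner; pass @LongView = 1 to return all polls rather than just the last three months:

EXEC dbo.sp_GetVoteHubPollingData @LongView = 1;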
And here is the design view of the data source for a report using this polling table data:
And here is the design view of the report that will use this data:
If we CREATE dbo.sp_GetVoteHubPollingData as a stored procedure on a database that our data source connects to (in this case, I created it in 'master'), then we can deploy the report to a Report Server or Power BI Report Server and run it:
This is the report as rendered within Power BI Report Server's Portal using extRSAuth for authentication and authorization
This is the report as rendered within extRSAuth custom PBIRS reporting portal on extrs.net
Lots of neat stuff you can do with the new features of SQL Server 2025- check 'em out.
Next up: embedding a PBI version of this report in extrs.net, and embedding a HighCharts JS version of this report in extrs.net- all using this dbo.sp_GetVoteHubPollingData SP that uses sp_invoke_external_rest_endpoint.
If you are trying to add the built-in ASP.NET Core Identity Authentication scaffolding to an MVC application, you will notice that traversing between MVC controller actions and simple requests to Razor Pages doesn't work as transparently as you might expect.
To scaffold the Identity Razor Pages so that we can customize them, we first do the following:
Right-click the project and select, "Add >> New Scaffolded Item..."
Select the Identity scaffolding template
From the next screen, select the Razor pages that you would like to scaffold for overriding/customizing
Once you have the scaffolded pages you will need to run this from the Developer console to generate the necessary ASP.NET Core Identity database tables: dotnet ef database update
If you want to be able to get to the Login page from an MVC controller, you will need to add an [AllowAnonymous] attribute to the Login.cshtml Razor Page code-behind class, and you will need to call the Login page while referencing the Area path in which it is located within the project file structure.
The folder structure for an MVC project overriding the built-in Identity auth Razor pages via scaffolding
An MVC logout link that you want to redirect to an overridden Identity auth Login Razor page might look like this in your _Layout view:
The view link in the MVC's _Layout.cshtml
The controller redirect (note: new { area = "Identity" }); "/Pages/" is not part of the route because Razor Pages routes are rooted at the Pages folder
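Putting those two pieces together, a hypothetical sketch (the controller and action names are placeholders):

// In _Layout.cshtml, the logout link targets a regular MVC action:
//   <a asp-controller="Account" asp-action="Logout">Logout</a>

// In the MVC controller, the redirect must carry the Razor Pages area route value;
// the physical "/Pages/" folder segment is not part of the page route:
public IActionResult Logout()
{
    // ... sign-out logic elided ...
    return RedirectToPage("/Account/Login", new { area = "Identity" });
}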
In order to reach the Login Razor page (when not being authenticated) you will need to add the [AllowAnonymous] attribute to the LoginModel class a la:
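// Areas/Identity/Pages/Account/Login.cshtml.cs (sketch; scaffolded members elided)
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc.RazorPages;

[AllowAnonymous]
public class LoginModel : PageModel
{
    // ... scaffolded login logic unchanged ...
}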
With [AllowAnonymous] users will be able to reach the overridden Identity Login page from any controller action in your MVC application
And that's it. You can customize all of the front-end and back-end code of the scaffolded Identity Razor pages. Here is an example of customization that uses the extRS service to create SSRS user accounts when a user registers a new Identity account for the app:
Scaffolding the Identity Razor pages allows you to focus only on the authentication design and logic that you need to customize
I'm not sure why but much of this is not documented (and what is documented is not easily found).
Lastly, do keep in mind that in order to override certain items like "_LoginPartial", you need to rename the view, as was done with "_LoginPartial2.cshtml" in the extRS project, for example.
Asset Classes are "a group of marketable financial assets that have similar financial characteristics and behave similarly in the marketplace". They represent the different types of (usually) tangible things that can be owned and assigned a "valuation".
Public Stocks (Equities)- shares of ownership in publicly-held companies
Stocks are listed on stock exchanges which are open to the public investment
Historically, have outperformed other investments over long periods
Most volatile in the short term
Returns and principal for a stock fluctuate over time, making funds from the eventual sale of the stock worth more or less than their original cost (depending on the stock purchase price)
Private Equity (Private Stocks) - shares of ownership in privately-held and unlisted companies
Typically, only a few large investors
Usually focuses on short-term capital extraction and "sale for parts"
Often financed by leveraged buyouts (loading the target company with new debt to help the PE firm fund the acquisition of the target company)
Can be designed to turn around the fortunes/profitability of a distressed company by cutting costs, replacing management and changing company direction; in this case the investment is more long-term, but ultimately the goal of PE is to extract a large profit from the sale of the acquired company once its profitability (and thus valuation) has improved
A "property flip" is a minor/small-scale form of PE
Bonds/Notes/Bills (Fixed Income) - guaranteed bond investments
Pays a set rate of interest over a given period, then returns the investor's principal
More stability than stocks
Value fluctuates due to current interest and inflation rates
Includes "guaranteed" or "risk-free" assets
Also includes money market instruments (short-term fixed income investments)
Often comprised of federal government or municipal bonds, notes or bills
Can also include corporate loans
The term used to describe this type of debt asset ("bond"/"note"/"bill") depends on the length of the debt instrument's maturity, with "bonds" typically having maturities of 10+ years, "notes" maturities of 1-10 years and "bills" (like T-Bills) maturities of less than 1 year
Cash and Equivalents (Liquid Assets) - assets that can be quickly and easily converted into immediately usable currency without losing significant value
Checking and savings accounts
Certificates of deposit (CDs)
Money market funds
Treasury bills
Treasury notes
Commercial paper
Foreign currencies which are easily convertible to the currency you need to transact in
Liquid assets have the advantage of giving their owner the power to buy things without incurring any debt
Cryptocurrency - a "piece" of (usually limited) digital currency
Liquidity has increased with institutional adoption
Supply varies, some are finite or even designed to shrink in supply and thus deflationary
No inherent value or utility (like NFTs)
But(!), cryptocurrency underlies the transactional architecture of the bulk of untraced, "black market" commercial activity worldwide
Barring meaningful cryptocurrency regulation, its value is unlikely to ever "go to zero" or not hold some significant value because of its usefulness in facilitating illegal monetary transactions and digital money laundering
Recently, interest in stocks and crypto has spiked while interest in bonds and private equity has remained more muted and stable
Real Estate - investment property (houses, stores, factories, land lots, etc.) and commercial real estate investments
Helps protect future purchasing power as property values typically rise with inflation
Values tend to rise and fall more slowly than stock and bond prices.
It is important to keep in mind that the real estate sector is subject to various risks, including fluctuation in underlying property values, interest rates (which directly influence mortgage rates, which usually compose a large part of any real estate purchase), eminent domain law, and potential environmental liabilities
Can include "Infrastructure as an asset class"- a broad category including highways, airports, rail networks, energy generation (utilities), energy storage and distribution (gas mains, pipelines etc.)
Can provide a long-term cash flow, a hedge against inflation, and diversification (low correlation with the top two traditional asset classes: equity and fixed income)
Commodities - physical goods such as gold, copper, crude oil, natural gas, wheat, corn and electricity
Can serve both as a value store in and of itself (in the case of things that don't expire, like precious metals) and as raw material for the construction and delivery of downstream physical goods and services
Helps protect future purchasing power as commodity values rise with inflation
Values tend to have low correlation with stock and bond prices
Price dynamics are unique: commodities become more volatile as prices rise
Interest in CRE and commodities has fallen precipitously since 2004...
I believe that a large part of the reason the vast majority of software projects end up over budget and finish well past schedule is that, all too often, both developers and product/project managers use minimizing language for things that turn out to be nowhere near as minimal, straightforward or simple as that language makes them sound.
Focus on keeping the main thing the main thing
When discussing software development, especially when discussing the estimation of time required to complete work items, whenever you speak or hear a statement that contains the phrases "it's just", "it's only", "it's simple", "that's just boilerplate", "it's already baked in", "that's just [insert a design pattern while completely omitting the context, data structures and design logic the pattern will be applied to]", or, in the Year of our Lord 2025, "ChatGPT will answer that"- run.
Run for the hills and do not return. I'm kidding. But do be very alert to phrases like these, because they indicate a potentially significant piece of your project that is being glossed over- something someone has thought about abstractly, but not in concrete (code implementation) terms.
This can be developers who are over-confident and/or feel pressured to give low estimates so that the project schedule does not seem imperiled.
This can be managers who just haven't stepped into the code or discussed the logic enough with the developers to understand the complexity behind a series of words that describe a conceptual software design.
Under-promise and always account for unknowns, which will lead to unforeseen roadblocks, detours and changes. If all goes according to plan (and it never does), you over-deliver on your over-estimates. If not, you may have given yourself enough buffer to still meet the planned schedule and will have successfully accounted for the inevitable unknown.
If you are already on a schedule that is unrealistic and bound to not be met by the deadline, then all you can do is change scope (cut or delay features). If you insist on keeping all the planned features, and have the luxury of time, then you can only increase the time (lengthen the delivery schedule to a future date).
You can certainly try to keep the same calendar date for a release deadline and "just" throw more developers and managers at the project, hoping they can all work round the clock and in parallel to increase productivity, but this never works. Domain knowledge and a cadence of solid productivity and cooperation across teams take a significant period of time for new hires to learn.
As a quote attributed to Warren Buffett says, "You can't produce a baby in one month by getting nine women pregnant."
And I repeat: Focus on keeping the main thing the main thing
When designing and project planning any significantly complex piece of software, all parties involved (the "stakeholders") must understand that a software project is not fixed- project schedules, planned features, and the human resources to implement the features are going to change.
Many thought that the move from top-heavy waterfall/SDLC-based approaches to Agile would solve this problem. But unfortunately, when Agile refuses to actually be "agile", the waterfall becomes an unnavigable white water rapid stream that is only slightly more conducive to building great things within a certain scheduled space of business time.
And in general, God bless a true Agile craftsmanship approach and all the time-tested statistical process control concepts it is based upon, but our software industry's hyper-reliance on "estimating" and "measuring" things that can unexpectedly and rapidly evolve (and oftentimes measuring the wrong things) does not jibe with realistic long-term software planning objectives and- almost invariably in a "hyper-time-boxed" project- leads to the worst of all outcomes in the software business: mismanaged (or specifically, "missed") expectations. Management of expectations is everything.
Deliver the most critical parts of a customer's needs first and deliver them as a flawless piece of beautiful software. Working software- especially the end-to-end functioning of your application's most critical workflow- is paramount; everything else should follow from that and never get in its way.
You can iterate, make changes and add features later on.
Focus on outcomes over processes, lest you sink into the bog of minimizing language metastasized into maximally time-consuming (sometimes completely unnecessary) work items- and a project that never seems to get delivered, or (worse, because first impressions are everything...) is delivered rife with show-stopping bugs.