I hadn’t touched Linux in many years, but after watching Microsoft Build this year, I thought it would be a fun project to get back into. Some of this may seem obvious to those used to running Linux, but since it’s been years for me, it’s always wise to document.
Before You Begin
I went with Ubuntu, the most popular, but I also really liked Kali for its speed. I tried many distros through my learning phase, figuring out the settings and configurations. Similarly, I went with a lightweight Desktop Environment (explained later) so I could get a quick response and get something going.
You can get the Ubuntu distro (and a few others) off the Microsoft Store.
Update Your Distro
The value of installing Ubuntu first is that their documentation is great. You’ll have to …
Our iSeries (mainframe/AS400) team has figured out a way to upgrade all our forms and can easily deposit them into a folder on the AS400. They are looking for the fastest way to push these forms to our customers.
They ask: is there a way I (the .NET Core dev) can retrieve their PDF statements from the mainframe (iSeries/AS400) for viewing online? The files are not exposed in Zend (the AS400 web server) but live in a specific “folder” (if you can call it that; the AS400 file system is structured differently).
Though I actively use their Zend system for my own querying API, their team had little interest in it beyond knowing how I was calling their programs. They wanted this SPECIFIC folder accessed to retrieve statements.
I proposed an HTTP web service via Zend, among other HTTP/PHP solutions, but in the end I explained what I am currently doing: using my own API to accept an authentication key with parameters (in JSON, via PHP), converting those parameters into the ones they need, and then calling a program. The program I call does the logic and …
- Azure Explorer is uploading too slow
- AZCopy copy feature will not upload. Error: “RESPONSE Status: 403 This request is not authorized to perform this operation using this permission.”
Today, I was trying to upload a directory (1.3 GB) of images to Blob Storage for public access. Though I often use Azure Storage Explorer for smaller uploads, it was proving impossible with this large directory: my connection ground to a halt and the files were taking far too long to upload.
Upon research, I learned that AzCopy is much more efficient with larger data transfers, and after getting it working I found this to be true. However, using a direct URL as per the docs was giving me a 403 error, despite my being the owner of the container and the container seeming to have the proper permissions. So I decided to append an SAS token to the end of the URL (also covered in the docs) and apply the azcopy copy command.
Below is how I proceeded to get it working.
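A sketch of the shape the command takes; the storage account, container, and SAS token below are placeholders you would substitute with your own values:

```shell
# Generate an SAS token for the container in the Azure portal
# (Storage account > Shared access signature), then append it to the
# destination URL. --recursive uploads the whole directory tree.
azcopy copy "./images" \
  "https://<storage-account>.blob.core.windows.net/<container>?<sas-token>" \
  --recursive
```

With the SAS token on the URL, the 403 from the direct-URL approach goes away, since the token carries the permission grant with the request.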
Even though all my projects are in .NET Core now, I rarely get the opportunity to use Identity because of my work with our backend Legacy system. Recently, though, I built a very lightweight SEO Management system for one of our sites (that allows a 3rd party to tweak our page titles, meta tags, etc) and wanted to give them user access and roles.
The entire project can be found on GitHub, but below is just a running list of sites I used to get this done, noting all the troubleshooting and the stupid little mistakes I made along the way.
Adding Identity to an Already Existing Project
This part was relatively easy, and the Microsoft docs provided an easy enough guide. I did have some issues:
CS1902 C# Invalid option for /debug; must be full or pdbonly – this came up with the data migrations because I didn’t have Entity Framework installed. Basically, all the errors in this phase were not what they claimed to be; they mostly meant I was missing packages needed to migrate.
To ensure I had all the right packages installed, I used: Microsoft.AspNetCore.App -Version 2.2.6
Also, in this area, I decided not to put the connection string in appsettings.json, opting …
We had an issue a couple of days ago where two points of failure caused some downtime and so I spent most of today revisiting all my monitors to give us as much warning as possible if one of our services fails.
One particular API I set up lives on the AS400 (iSeries) and runs off its ZendCore server in PHP. I quickly scripted a page to verify the connection between ZendCore PHP and DB2, that the connection to that endpoint was in fact over SSL, and on what port.
The PHP code is very simple and returns a JSON string that looks like this:
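The original response isn’t reproduced here, but based on the checks described above it has roughly this shape (AS400CONNECTION and SSL are the fields referenced below; the PORT field and the exact values are illustrative):

```json
{
  "AS400CONNECTION": true,
  "SSL": true,
  "PORT": "443"
}
```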
The slight problem today was figuring out how to get Site24x7 to not just verify the link, but to check the JSON values and confirm they are what I need; if not, send the alert. In this case, I want to verify that AS400CONNECTION is true and SSL is true.
Site24x7 suggests using a REST API monitor and then asserting the value with JSONPath. I was completely new to this, and finding a good clean example was a bit tough; hopefully this saves someone some time. Here’s …
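To make the assertion concrete, here is the equivalent check in plain Python. The JSON payload is assumed from the fields named above, and the JSONPath expressions in the comments are the kind you would enter in the monitor’s assertion fields:

```python
import json

# Simulated response body from the PHP status page
# (shape assumed from the AS400CONNECTION / SSL fields described above)
response_body = '{"AS400CONNECTION": true, "SSL": true, "PORT": "443"}'
data = json.loads(response_body)

# Monitor assertion:  JSONPath $.AS400CONNECTION  should equal  true
as400_ok = data["AS400CONNECTION"] is True

# Monitor assertion:  JSONPath $.SSL  should equal  true
ssl_ok = data["SSL"] is True

# If either check fails, the monitor would fire the alert.
print(as400_ok and ssl_ok)
```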
As noted, I work with Legacy systems and often have to bring in variables from the API that must be sustained across a session (and I’m sure there might be a better way; comment and advise!). Where I am now: I query the API and bring in the variables, but how do I keep from calling these over and over? The old solution was session variables, and so that’s where I am.
When I started to do this on Core, the most helpful article was this (and it’s in my comments):
He leads you through the basic setup of an HttpContext helper class (which I still use today) and how to configure Startup. Today, though, I came across a problem: I was able to Set session variables, but Get was returning null.
Order. Yes, you’ll see a thousand Stack Overflow answers about order in Configure (and I was careful to do this in that method), but now in ConfigureServices (contrary to the example, as I am now using Core 2.2?), order again comes into play:
public void ConfigureServices(IServiceCollection services)
//this MUST be before add mvc or session returns null
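Putting it together, a minimal Startup sketch for Core 2.2 showing both orderings; this assumes the default in-memory session store:

```csharp
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddDistributedMemoryCache();   // backing store for session
        services.AddSession();                  // the fix: registered BEFORE AddMvc,
                                                // otherwise session Get returns null
        services.AddMvc()
            .SetCompatibilityVersion(CompatibilityVersion.Version_2_2);
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseSession();   // likewise, in the pipeline before UseMvc
        app.UseMvc();
    }
}
```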