Comics Catalog – My first Web API

I have been an avid comic reader since I was young. I have a lot of comics, mainly Marvel, and I would like an easy way to catalog them and browse them when needed. For this reason I would like to create an app to add and catalog them. Let’s start from the smallest possible concept: a table in a database (SQL Server) defining a first, embryonic concept of a Comic, like the one below:

  • ID: unique identifier
  • Series: name of the comic series
  • Title: title of the specific comic
  • Number: progressive number of the comic within the series

I’m creating this in an empty SQL database (XinCataLog), in a table named “Comic”.

Comic Table

Let’s now move to the Visual Studio 2019 IDE to create the Web API project.

ASP NET Web Application

I will name the project XinCataLogAPI

Create Project

As target framework I’m choosing 4.7.2, the latest one. For the template we need to choose Web API with no authentication (in a lot of the examples I saw, authentication is not used; maybe I’m wrong).

Web API Template

OK, now the empty project is created. I’ll build the project and run it in debug and, voilà, the default web app is running.

ASP.NET Web App

Let’s add the project to a GitHub repository, so we have a source control repository to push and pull versions from.

Since we need Entity Framework to retrieve data and manage the CRUD operations, I’ll add a new item to the solution under the Model folder, choosing “ADO.NET Entity Data Model”, and I’ll name it DBModel.

ADO.NET Entity

Now, in the wizard that opens, let’s choose the Database First option:

Database First Option

In the dialog that opens, select the database:

Database Connection

As options I chose Entity Framework 6.x and then, from the list of objects, the table XinComic:

XinComic Table

Once you press Finish, the diagram will pop up showing the selected object, and a set of items will be generated under the Model folder.

Model

Before moving ahead, let’s spend a minute on two of the classes generated by the wizard:

  • DBModel.Context.cs contains the context class used to communicate with the database
  • XinComic.cs is the entity representing the database table XinComic
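
To give an idea of what is inside, here is a simplified sketch of what the wizard generates (the actual class names depend on the names chosen in the wizard, and the real files are auto-generated partial classes that should not be edited by hand):

// Simplified sketch of the generated code (illustrative names)
using System.Data.Entity;

// Context class found in DBModel.Context.cs
public partial class XinCataLogEntities : DbContext
{
    public XinCataLogEntities() : base("name=XinCataLogEntities") { }

    // Exposes the XinComic table as a queryable set
    public virtual DbSet<XinComic> XinComics { get; set; }
}

// Entity class found in XinComic.cs, one property per table column
public partial class XinComic
{
    public int ID { get; set; }
    public string Series { get; set; }
    public string Title { get; set; }
    public int Number { get; set; }
}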

Now that we have the classes that model the database we can move ahead. Typically an API provides the basic functionality to access database objects; more in detail, the four main HTTP methods (GET, PUT, POST, and DELETE) can be mapped to CRUD operations as follows (see also here [2]):

  • GET retrieves the representation of the resource at a specified URI. GET should have no side effects on the server.
  • PUT updates a resource at a specified URI. PUT can also be used to create a new resource at a specified URI, if the server allows clients to specify new URIs. For this tutorial, the API will not support creation through PUT.
  • POST creates a new resource. The server assigns the URI for the new object and returns this URI as part of the response message.
  • DELETE deletes a resource at a specified URI.
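
In practice, with the default Web API route template api/{controller}/{id} and the XinComicsController we are about to create, the mapping looks roughly like this:

  • GET /api/XinComics – read the full list of comics
  • GET /api/XinComics/5 – read the comic with ID 5
  • POST /api/XinComics – create a new comic (passed in the request body)
  • PUT /api/XinComics/5 – update the comic with ID 5
  • DELETE /api/XinComics/5 – delete the comic with ID 5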

Since we already have a model class created as above, we can now create a controller to implement the functionality described. To do this, right-click the Controllers folder and add a new controller like below:

Web API Controller with Read/Write actions

In the dialog that opens I just need to specify the model and the data context created before:

XinComicsController

Then, after the wizard does its work, you should have the controller correctly created: now in the Controllers folder I have all the CRUD operations for my Comic:

XinComicsController
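
To give an idea of the result, below is an abridged sketch of the kind of controller the scaffolder produces (the generated version also covers PUT and DELETE, handles concurrency conflicts and disposes the context):

// Abridged, illustrative sketch of the scaffolded Web API 2 controller
using System.Linq;
using System.Web.Http;
using System.Web.Http.Description;

public class XinComicsController : ApiController
{
    // EF6 context generated by the wizard (the actual class name may differ)
    private XinCataLogEntities db = new XinCataLogEntities();

    // GET: api/XinComics
    public IQueryable<XinComic> GetXinComics()
    {
        return db.XinComics;
    }

    // GET: api/XinComics/5
    [ResponseType(typeof(XinComic))]
    public IHttpActionResult GetXinComic(int id)
    {
        XinComic xinComic = db.XinComics.Find(id);
        if (xinComic == null)
            return NotFound();

        return Ok(xinComic);
    }

    // POST: api/XinComics
    [ResponseType(typeof(XinComic))]
    public IHttpActionResult PostXinComic(XinComic xinComic)
    {
        db.XinComics.Add(xinComic);
        db.SaveChanges();

        return CreatedAtRoute("DefaultApi", new { id = xinComic.ID }, xinComic);
    }
}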

OK, looks good. Now if we rebuild the project and press the API link at the top, we are directed to a page containing the list of available methods:

List of API and Methods

If we now browse to the URL of the GET method we can test the functionality:

Get Method result

That is great, but if we want to test the methods that require a body it is not so straightforward. In this case you may need to use a tool like Postman or ReqBin, which can send the request with a simulated payload, as you can see below:

API POST Test
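
For example, to exercise the POST method you can send a request to /api/XinComics with a JSON body shaped like the entity (the values below are purely illustrative, and the ID can be omitted if the column is an identity):

{
  "Series": "The Amazing Spider-Man",
  "Title": "Coming Home",
  "Number": 30
}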

This proves the service is replying correctly. Another possibility, which I prefer, is to leverage Swagger to have a very intuitive way to test the API; if you are interested please have a look at this post [4]. If you would like to look at the code, please check it out [5] in my GitHub repository.

[1] https://docs.microsoft.com/en-us/answers/questions/357012/can39t-find-adonet-entity-data-model-missing-visua.html

[2] https://docs.microsoft.com/en-us/aspnet/web-api/overview/older-versions/creating-a-web-api-that-supports-crud-operations

[3] https://stackoverflow.com/questions/45139243/asp-net-core-scaffolding-does-not-work-in-vs-2017

[4] https://www.beren.it/en/2022/03/26/comic-catalog-my-first-swagger-web-api/

[5] https://github.com/stepperxin/XinCataLogAPI

Comic Catalog – My first Swagger Web API

I have been an avid comic reader since I was young. I have a lot of comics, mainly Marvel, and I would like an easy way to catalog them and browse them when needed. For this reason I would like to create an app to add and catalog them. Let’s start from the smallest possible concept: a table in a database (SQL Server) defining a first, embryonic concept of a Comic, like the one below:

  • ID: unique identifier
  • Series: name of the comic series
  • Title: title of the specific comic
  • Number: progressive number of the comic within the series

I’m creating this in an empty SQL database (XinCataLog), in a table named “Comic”.

Comic Table

Let’s now move to the Visual Studio 2019 IDE to create the Web API project.

ASP NET Core Web API Template

I will name the project XinCataLogAPI

Create Project

As target framework I’m choosing 5.0, which is the current one. For the moment I’ll leave the Authentication Type set to None (I see multiple examples on the web that don’t use it).

Framework and Authentication

OK, now the empty project is created, with a demo class, WeatherForecast, to demonstrate that it works. Let’s then run the project, and here we go:

Swagger API interface

We have the Swagger representation of our Web API. Of course the methods are just a demo, but as a starting point this is cool enough.

Let’s add the project to a GitHub repository, so we have a source control repository to push and pull versions from. This is just a suggestion, but I strongly recommend pushing a checkpoint every time you have a reasonably stable version, so you can go back in case something you did (even unintentionally) messes up your work. Since we need Entity Framework to retrieve data and manage the CRUD operations on SQL Server, I will add the related NuGet package, taking care to pick the latest compatible version.

EntityFrameworkCore 5.0.15

Then we need to do the same for the Tools and SqlServer packages. After this, the full list of installed packages should be the one below:

NuGet Packages
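
If you prefer the Package Manager Console over the NuGet UI, installing them should amount to something like the commands below (the version is whatever is the latest compatible one, 5.0.15 at the time of writing):

Install-Package Microsoft.EntityFrameworkCore -Version 5.0.15
Install-Package Microsoft.EntityFrameworkCore.SqlServer -Version 5.0.15
Install-Package Microsoft.EntityFrameworkCore.Tools -Version 5.0.15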

In the past I remember playing with a useful wizard for the Database First approach, which basically generates a data model starting from the database I need to work with. Unfortunately I was not able to find it, and it looks like this is because with .NET Core there is no way to add an ADO.NET Entity Data Model item [1]. Anyway, it is possible to do the same thing by leveraging a command in the Package Manager Console. Let’s open it and run the following command:

Scaffold-DBContext "Server=WIN-7O15VR47QA6;Database=XinCataLog;Trusted_Connection=True;"  Microsoft.EntityFrameworkCore.SqlServer -OutputDir "Models" -Tables XinComic -f  

This command specifies which tables (in this case only XinComic) to scaffold into the Models directory; you can see the result below.

Package Manager Command
Models Folder

The XinCataLogContext class is the one used to communicate with the database, while XinComic is the entity representing the database table.
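
A simplified sketch of the two scaffolded files is shown below (the real scaffolded context also contains an OnConfiguring override with the connection string and an OnModelCreating method with the column mappings):

// Simplified sketch of the scaffolded classes in the Models folder
using Microsoft.EntityFrameworkCore;

public partial class XinCataLogContext : DbContext
{
    public XinCataLogContext(DbContextOptions<XinCataLogContext> options)
        : base(options)
    {
    }

    public virtual DbSet<XinComic> XinComics { get; set; }
}

public partial class XinComic
{
    // Property names mirror the columns of the XinComic table
    public int Id { get; set; }
    public string Series { get; set; }
    public string Title { get; set; }
    public int Number { get; set; }
}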

Typically an API provides the basic functionality to access database objects; more in detail, the four main HTTP methods (GET, PUT, POST, and DELETE) can be mapped to CRUD operations as follows (see also here [2]):

  • GET retrieves the representation of the resource at a specified URI. GET should have no side effects on the server.
  • PUT updates a resource at a specified URI. PUT can also be used to create a new resource at a specified URI, if the server allows clients to specify new URIs. For this tutorial, the API will not support creation through PUT.
  • POST creates a new resource. The server assigns the URI for the new object and returns this URI as part of the response message.
  • DELETE deletes a resource at a specified URI.

Since we already have a model class created as above, we can now create a controller to implement the functionality described. To do this, right-click the Controllers folder and add a new controller like below:

API Controller with Actions using EntityFramework

In the dialog that opens I just need to specify the model and the data context created before:

XinComicsController

Then, after the wizard does its work, you should have the controller correctly created.

Actually, the first time I did it I got a weird error: “Unhandled exception. System.IO.FileNotFoundException: Could not load file or assembly ‘Microsoft.VisualStudio.Web.CodeGeneration.Utils, Version=5.0.2.0, Culture=neutral, PublicKeyToken=adb9793829ddae60’. The system cannot find the file specified. File name: ‘Microsoft.VisualStudio.Web.CodeGeneration.Utils, Version=5.0.2.0, Culture=neutral, PublicKeyToken=adb9793829ddae60’”, which fortunately I solved with the help of Stack Overflow [3]. If you don’t hit it, even better; otherwise that answer fixed the issue for me.

Here we go: now in the Controllers folder I have all the CRUD operations for my Comic:

XinComicsController
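
For reference, the generated actions look roughly like the abridged sketch below; note that the XinCataLogContext is injected through the constructor, which is exactly what will matter in a moment:

// Abridged sketch of the scaffolded controller (GET actions only)
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

[Route("api/[controller]")]
[ApiController]
public class XinComicsController : ControllerBase
{
    private readonly XinCataLogContext _context;

    // The context is resolved by dependency injection
    public XinComicsController(XinCataLogContext context)
    {
        _context = context;
    }

    // GET: api/XinComics
    [HttpGet]
    public async Task<ActionResult<IEnumerable<XinComic>>> GetXinComics()
    {
        return await _context.XinComics.ToListAsync();
    }

    // GET: api/XinComics/5
    [HttpGet("{id}")]
    public async Task<ActionResult<XinComic>> GetXinComic(int id)
    {
        var xinComic = await _context.XinComics.FindAsync(id);

        if (xinComic == null)
        {
            return NotFound();
        }

        return xinComic;
    }
}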

Well, we should be almost there: if we run the application we do find the new methods in the Swagger interface

XinComics methods

but if we invoke the GET method we get an error:

System.InvalidOperationException: Unable to resolve service for type 'XinCataLogSwaggerWebAPI.Models.XinCataLogContext' while attempting to activate 'XinCataLogSwaggerWebAPI.Controllers.XinComicsController'.

which means we have an issue with the initialization of the XinCataLogContext. Actually, we didn’t register the DbContext within the services, as also described here [4]:

        // This method gets called by the runtime. Use this method to add services to the container.
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddDbContext<XinCataLogContext>(options =>
              options.UseSqlServer(Configuration.GetConnectionString("DefaultSQLServerConnection")));

            services.AddControllers();
            services.AddSwaggerGen(c =>
            {
                c.SwaggerDoc("v1", new OpenApiInfo { Title = "XinCataLogSwaggerWebAPI", Version = "v1" });
            });
        }

The first line of the method registers as DbContext the one we generated with Scaffold-DbContext. Now, before running the application again, let’s add a connection string in the appsettings file to ensure the right one is used:

  "ConnectionStrings": {
    "DefaultSQLServerConnection": "Server=WIN-7O15VR47QA6;Database=XinCataLog;Trusted_Connection=True;"
  }

That’s it: if you now run the app and try the methods, they work fine. We got our first API (with Swagger) by writing basically only a couple of lines of code: the largest part of the work is done by Visual Studio.

Get method from Database

[1] https://docs.microsoft.com/en-us/answers/questions/357012/can39t-find-adonet-entity-data-model-missing-visua.html

[2] https://docs.microsoft.com/en-us/aspnet/web-api/overview/older-versions/creating-a-web-api-that-supports-crud-operations

[3] https://stackoverflow.com/questions/45139243/asp-net-core-scaffolding-does-not-work-in-vs-2017

[4] https://docs.microsoft.com/en-us/aspnet/core/data/ef-mvc/intro?view=aspnetcore-6.0#register-the-schoolcontext

XinLog 1.1 – Make it configurable

I’m still working on improving the log utility. First, I would like the user to decide where to log. As you may remember from my previous post, the logs are all created in the same folder as the script. It would be better to have the possibility to choose where to place them; for this reason I’m creating an Open-Log function which tells the logger where to do it.

# Get the current Directory
$_StoragePath = Split-Path -Parent $MyInvocation.MyCommand.Path
#Set the file log name
$_Logfile = "_StoragePath\XinLog_$($env:computername)_$((Get-Date).toString("yyyyMMdd_HHmmss")).log"

function Open-Log{
    Param (   
        [string]$StoragePath
    )
    #set the folder name
    $_StoragePath = $StoragePath
    #Set the file log name
    $_Logfile = "$_StoragePath\XinLog_$($env:computername)_$((Get-Date).toString("yyyyMMdd_HHmmss")).log"
}

If we invoke Open-Log before using the Write-Log function, I would expect the variable $_StoragePath to be initialized and then reused each time I call Write-Log. If you try it like this you’ll see it does not work properly. Why? Because the assignment inside Open-Log creates a variable whose scope is limited to the function and script it runs in, so the value does not persist outside of it. This is definitely bad for us, since it means we would need to re-initialize XinLog every time.

We need to play with the scope of the variables (please have a look at this document [1], which is very interesting). To fix it we need to declare those variables as global, which extends their scope to the whole session. It works!

# Get the current Directory
$global:_StoragePath = Split-Path -Parent $MyInvocation.MyCommand.Path
#Set the file log name
$global:_Logfile = "$global:_StoragePath\XinLog_$($env:computername)_$((Get-Date).toString("yyyyMMdd_HHmmss")).log"

function Open-Log{
    Param (   
        [string]$StoragePath
    )
    #set the folder name
    $global:_StoragePath = $StoragePath
    #Set the file log name
    $global:_Logfile = "$global:_StoragePath\XinLog_$($env:computername)_$((Get-Date).toString("yyyyMMdd_HHmmss")).log"
}

As you can see, the only important thing to do is to use the $global: prefix for all the variables whose scope we need to extend to the whole session.
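
A quick usage example from a calling script (the target folder below is just a hypothetical path):

# Include the logger with the dot source notation, as seen in the previous post
$myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
. "$myDir\XinLog.ps1"

# Redirect the logs to a custom folder, then log as usual
Open-Log -StoragePath "C:\Temp\Logs"
Write-Log "This line ends up in a XinLog_*.log file under C:\Temp\Logs"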

Finally, as is always a good idea, let’s add some try/catch so that unexpected issues don’t stop the log from working, as sketched below. Here we go: XinLog 1.1 is ready to go (GitHub [2]).
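
A possible shape for the protected Write-Log is sketched below (just a sketch; the actual 1.1 code is what you find in the repository):

function Write-Log
{
    Param (
        [Parameter(Mandatory)]
        [string]$LogString
    )
    $Stamp = (Get-Date).toString("yyyy/MM/dd HH:mm:ss")
    $LogMessage = "$Stamp - $LogString"
    try {
        # If the file cannot be written (missing folder, locked file, ...)
        # do not crash the caller: report the problem on screen only
        Add-Content $global:_Logfile -Value $LogMessage -ErrorAction Stop
    }
    catch {
        Write-Host "Unable to write the log file: $($_.Exception.Message)" -ForegroundColor Yellow
    }
    Write-Host $LogString -ForegroundColor White
}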

[1] https://www.varonis.com/blog/powershell-variable-scope

[2] https://github.com/stepperxin/XinLog

XinLog 1.0

Looking back at the thread of the past weeks [1], I think it is better to spend a few more minutes creating a separate, reusable library for logging. Indeed, logging is one of the most common activities, and it would be great if this functionality could live in a separate file that I can simply include in my script when needed.

For this reason I’ll create a new ps1 file called XinLog where I put all the logging logic:

################# XinLog 1.0 ######################
# Use this library to log easily to file and screen
# In Parameters MANDATORY
#   - $LogString [STRING] Message to log
# The log will be written to a file created in the same directory as the caller
###################################################

# Get the current Directory
$myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
#Set the file log name
$Logfile = "$myDir\XinLog_$($env:computername)_$((Get-Date).toString("yyyyMMdd_HHmmss")).log"

#begin FUNCTIONS
function Write-Log
{
    Param (
        [Parameter(Mandatory)]    
        [string]$LogString
    )
    $Stamp = (Get-Date).toString("yyyy/MM/dd HH:mm:ss")
    $LogMessage = "$Stamp - $LogString"
    Add-content $LogFile -value $LogMessage
    Write-Host $LogString -ForegroundColor White
}

#end FUNCTIONS

Actually I didn’t do much: I just extracted the logging-related lines from the previous file, moving over the Write-Log function body, which is the log writer.

Now the question is: how can I use the Write-Log function if it is now in a separate file? The answer is really straightforward: as in most languages, in PowerShell it is possible to include one script in another. To do this we leverage the “dot source notation” (you can find more information here [2]). In our scenario the original Conto2.3 file changes as below:

########### CONTO 2.3.1 ############
# First test
####################################

##### Included Libraries ##########
# Get the current Directory
$myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
# Files included 
. "$myDir\XinLog.ps1"  # Include the file logs
####################################


#begin BODY

Write-Log "Text to write"

#end BODY

Here [3] you can also find a GitHub project I created for this purpose.

[1] https://www.beren.it/2022/01/07/conto2-3-logging-with-powershell-get-started/

[2] https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_scopes?view=powershell-5.1#using-dot-source-notation-with-scope

[3] https://github.com/stepperxin/XinLog

Conto2.3 – Log with PowerShell

As said during the design [1], all the business logic related to the ET part will be in PowerShell. For people who may not know this language, I can anticipate that it is a Microsoft product which can also be installed and used on Linux [2], even if I’m not sure that would make sense. It gives a more flexible and advanced experience than the regular command line.

Starting to code in PowerShell is very easy and you just need an editor. I would suggest using Visual Studio Code [3], which is free, light, supports a lot of languages and also provides a debug mode which can help you a lot with troubleshooting.

Coming back to our topic, first we need to design a way to log (on screen and on file) the actions the script is performing. To do that we’ll create an ad hoc function named WriteLog which will print what we want both in the shell and to a file.

  • Screen writing will be covered by Write-Host “text to write”
  • File writing will be covered by Add-Content File -Value “text to write”

In the file output we are also adding a timestamp for each row we write.

function WriteLog
{
    Param (
       [Parameter(Mandatory)]    
       [string]$LogString
    )

    $Stamp = (Get-Date).toString("yyyy/MM/dd HH:mm:ss")
    $LogMessage = "$Stamp - $LogString"
    Add-content $LogFile -value $LogMessage
    Write-Host $LogString -ForegroundColor White
}

As the first statement you can find the keyword Param: this is used to declare the parameters of the function. In this case it is the text to log, and it is marked as Mandatory since there is no log without text to log.

Let’s try it out then

WriteLog "Text to write"

Executing the script from the command line (or in the Visual Studio Code IDE), we’ll get the text we are expecting as the response in the shell.

WriteLog example

Good but not enough, since we also want to write to a physical file; to do this we need to define the file to write to and place it in the same folder where the script is located.

$myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
$Logfile = "$myDir\log_$($env:computername)_$((Get-Date).toString("yyyyMMdd_HHmmss")).log"

In the first row we get the path of the script’s folder and store it in the $myDir variable. In the second line we build the log file name, composed of the machine name plus a timestamp, in order to write a different file at each run.

Running the script again, we now have a log file in the folder.

Created file log
File log content

Please note how simple it is to concatenate strings in PowerShell using variables: in $LogMessage = "$Stamp - $LogString" we are just writing the variables (starting with $) within the text. Quick and smart.

Summing up: we just created a first version of the PowerShell script which logs a sample text on screen and to a file. You may notice that, to keep it easy to read, the WriteLog function is located at the top of the script while it is invoked only later, in the bottom part. Keep in mind that in PowerShell a function must be declared before the point where it is invoked, since the script is executed top to bottom.

#begin FUNCTIONS
function WriteLog
{
    Param (
        [Parameter(Mandatory)]    
        [string]$LogString
    )
    $Stamp = (Get-Date).toString("yyyy/MM/dd HH:mm:ss")
    $LogMessage = "$Stamp - $LogString"
    Add-content $LogFile -value $LogMessage
    Write-Host $LogString -ForegroundColor White
}

#end FUNCTIONS

# Get the current Directory
$myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
#Set the file log name
$Logfile = "$myDir\log_$($env:computername)_$((Get-Date).toString("yyyyMMdd_HHmmss")).log"

WriteLog "Text to write"

You can download here [4] the zip with the script to try it on your own.

[1] https://www.beren.it/en/2022/01/07/conto2-3-the-design/

[2] https://docs.microsoft.com/it-it/powershell/scripting/install/installing-powershell-on-linux?view=powershell-7.2

[3] https://code.visualstudio.com/download

[4] https://www.beren.it/wp-content/uploads/2022/01/CONTO2.3-01.zip

Conto2.3 – The design

If you landed on this post it’s probably because you have some interest in the topic (see all the requirements here), so we can now move on to define in more detail how this process should work:

  1. define how to structure the input/output, focusing on who does what
  2. define how to configure the above in a way that gives us a high level of scalability and reusability
  3. design a logging system in order to monitor and analyze the actions taken by the system, helping the troubleshooting

1) Input and Output Definition

As said, we would like to handle different file types coming from different sources: the bank account statement, the credit card report, and also different owners: if my wife and I have different banks, the source files will probably be different.

Source folder architecture

Theoretically I could add as many owners as I want, keeping a structure which presents the bank first and then the other sources. In those folders I would also load the files as-is, without doing any manipulation.

A PowerShell script located in the root folder will load those files and normalize them. By normalize I mean finding a set of columns which are mandatory for the system:

  • Data Operazione: the date of the expense
  • Causale: brief indication of the reason
  • Descrizione: detailed information about the expense
  • Ammontare: the amount of the expense
  • Accredito: I noticed that in some cases inbound amounts are placed in a different column from the outbound ones (Ammontare)

Once the information is extracted from the sources, it needs to be exported to an OUTPUT folder, also divided into two different folders, which will contain all the original data in the same format and without any difference based on the owner; this will simplify the Excel job a lot.

Output folder configuration

The last step is in Excel itself, which will load those files and populate some spreadsheets in order to correctly display them in a proper pivot table.

There is still one missing part though, which is the mapping. Who maps the information in the files, and where does this mapping come from? Unfortunately this is the most painful part, since my expectation is to manage the mapping in a totally separate Excel file where I can define whether a description is categorized one way or another. This is the tricky part and the reason why Excel alone is not enough and we need PowerShell’s help. Basically, right after the script starts it will load all the mapping information and then reuse it to map the expenses correctly, in a way that is easy for Excel to manage.

2) How to configure the process

Everything we said so far is just the base logic, but in order to easily maintain and extend the process we need a configuration file which tells where to find the files and the information within them. This is key to avoid changing code in case of any change in the sources.

3) How to log

Finally, in order to have a good level of detail on what the process did, we need a way to log the actions for easy troubleshooting.

Root Folder

To sum up:

  • Config.xml: the file containing the configuration
  • Conto2.3.ps1: the file where the ET (Extract and Transform) logic is performed
  • Conto2.3.xlsx: the Excel file which manages the L (Load) and creates the pivot
  • MasterCategories.xlsx: the file used for the mapping during the ET phase



Conto2.3 – Manage Family balance with PowerShell and PowerQuery

The most annoying thing about family balance management is repeating almost the same actions each month. There are plenty of tools already available which may help with this (like Excel), but all of them have some limitations, and what I want is a tool which, once configured, needs only a couple of clicks to be fed.

Spoiler: as you may see reading this post and the following ones, I’m not 100% sure I reached the target, but I did my best.

Let’s start from the requirements:

  1. The sources of the expenses normally come from different sides: my bank account, my wife’s one, the credit card balance…
  2. The format of these files can vary, mainly csv and xlsx
  3. The content of these files can be very different: the number and date formats, for instance, or the names and order of the columns
  4. What we need is for all these files, combined together, to be loaded into a single Excel file to which we can apply some pivot tables, to easily monitor the monthly status of all the expenses and quickly spot when a specific entry is higher than expected.
  5. Finally, each month, load the new files in a few clicks. Some adjustments to the mapping may of course be needed, but they shouldn’t be mandatory.

Another key part is the mapping, since the information stored in the files as-is may not be relevant, or may be too granular, to extract meaningful insights. As an example: if one of the expenses is “Pizzeria da Mario” (I’m assuming you know what a pizza is), I would like it to be categorized as “Lunch & Dinners”, and hopefully all the entries containing “Pizza” as well. This means that loading the data is just not enough: we need to transform the information, in a process that IT guys normally call ETL (Extract, Transform and Load).

As I already stated above, there are a lot of tools which provide such features (Excel among others). Unfortunately, maybe because of my poor skills, I was not able to find a single tool which can do everything I want, and because of this I prefer to split the actions as below:

  • Extract: PowerShell
  • Transform: PowerShell
  • Load: Excel

More in detail: PowerShell scripts will identify the source files, read and normalize their contents, and map them so they can be correctly aggregated in an Excel file containing a pivot table that is easy to manage and accessible even to your wife.

Chapter 9 – The turning point

We approach the game with Macclesfield in an emergency: Preece has not yet recovered, while Marshall, Walker and Harsley spent everything in Wednesday’s game. I have to take some risks. I choose to field two full backs as wingers: Akhmedov on the left and Blamey on the right. Up front are Ormondroyd and Forrester. The game starts badly: after twenty minutes we are losing 1-0. With the current set-up we cannot do much, and at the beginning of the second half I remove the usual apathetic Akhmedov and insert Marshall on the right, moving Blamey to the left. We manage to gain some yards, but even bringing on Harsley nothing changes. It ends 1-0 and, as usual, they scored against us with their only shot on goal… The following Wednesday it’s Rotherham for the second round of the Windscreen Shield. In attack it is a real disaster: 3 out of 4 players are injured and not available. We lose 3-0 and we are out.

3 out of 4 forwards are injured

On Saturday we play the second round of the FA Cup and win 4-1 against Stalybridge: a double from Ormondroyd, plus goals from Preece (MoM) and Crosby. It was easy; the next round will be against Ipswich Town of the first division: it will be tough.

Finally some rest to recover the resources lost in the last games. We play the last home game against Bristol C, below us by only 4 points. We dominate but never hit the target, and it basically ends 0-0. With Harsley and Mainwaring still injured, if old Ormondroyd doesn’t manage it, we can never find the net…

Chapter 8 – The injury

After the series of home games I prepare for the away match at Barnet. In the meantime I get an offer for Sertori; the financial situation is quite disastrous, but I cannot deprive myself of my best defender in terms of tackles and headers, so I refuse it. Given the economic situation, I renew Marshall, Hope and Ormondroyd, who were among the players whose contracts were about to expire. I can’t lose them. There are still 5 players with contracts expiring soon: Walker, Clarke, Housham, Wilcox, Shakespear. Clarke and Housham will certainly not be renewed. Wilcox certainly deserves it, but he is really getting old (now 34)…

The game with Barnet is a disaster: we suffer continuously and create very little; the result is the only thing that can be saved, 2-2. Not a great omen in view of the next match at Peterborough. With D’Auria returning behind the strikers, the first-choice attacking duo becomes Harsley-Mainwaring. We deservedly win 1-0 with a goal from Harsley. After two months of playing every three days there is finally no midweek fixture. We host Darlington and Mainwaring scores the first hat-trick of his young career: it finishes 3-0, with his first MoM. Another week and we face Cardiff away; it ends as it often has in the past months: 2 shots, 2 goals for them, and we waste too much. The following Wednesday, in the second game of the Windscreen Shield, we get a nice 2-2 on Grimsby’s field which puts us at the top of the group.
Three days later we play the first round of the FA Cup at home against Altrincham; the game is very tight and there are not many chances: it ends 0-0 after 90 minutes. It will have to be replayed in a week.

On November 14th we host Northampton, just above us in the standings; it is one of those matches where you need to get points for a thousand different reasons. After three minutes Mainwaring collapses: shin splints. Northampton dominate and win 4-2. The news after the game is terrible: Andrew will be out for 4 months…

Mainwaring Injury

The next match is the replay against Altrincham. Preece has just returned from injury and is still not in condition. On the left I give McAuley a rest and promote Akhmedov. The first half flows without much emotion. At 60′ Marshall puts us ahead. At that point Altrincham try to react, but a couple of good saves by Colgan seal the final 1-0. The draw pits us against Stalybridge, a team from the minor leagues: the third round is within our reach.

Chapter 7 – A star is born

On September 26th we host Lincoln, fifth in the standings. Not an easy match, but after two wins in three games it would be important to put together two straight wins. The match remains balanced, but at 72′ we go behind and are not able to equalize. Ormondroyd as attacking midfielder is too far from the goal. Mainwaring plays a good half an hour without producing much. We are playing every three days, so the following Saturday we go to play at York. It ends in a rout: all the attackers score, Harsley even scores twice and Preece is MoM. Mainwaring also scores his first goal as a senior player. We are now back to 13th position.
The following game in the Windscreen Shield group stage takes place the following Wednesday, against Preston, a second division team. Given the lower importance of the tournament compared to the championship, I make some changes: I field Hope in place of Wilcox and Mainwaring from the start to team up with Harsley. We dominate the match and above all Mainwaring scores both goals in the final 2-0. A star is born? The next match is at Mansfield. After the break I put Ormondroyd back in the middle of the attack with Harsley, and I keep Walker in midfield, opting for a classic 4-4-2. After 2 minutes Ormondroyd gets injured and it is therefore up to Mainwaring to take his place. We win 2-1 with Mainwaring’s decisive goal; the MoM, however, is Colgan. Third victory in a row across all competitions, and above all 5 victories in the last 7 matches and tenth position in the standings, close to the playoff zone. Has the turning point finally come?

Theoretically the calendar is good: we have 4 home games in a row and we could really pick up a good pace and stay in a good position in the table. The first is against penultimate-placed Rotherham, the designated victim. We go behind almost immediately; Harsley equalizes in the 70th minute, but 5 minutes later Walker commits the most classic of own goals and instead of three easy points we end up with nothing at all: 1-2. Three days later it’s Brighton, currently in seventh place. A very balanced game like the one with Rotherham, and again we go behind; we come back with Ormondroyd but it is not enough and we suffer a 2-1 defeat at the end. The third match is against Shrewsbury, sixth in the standings. It ends 1-1 with the opposing goalkeeper getting a 9 rating! On the positive side, Mainwaring hits the target again. The last of the home game series is against Swansea, above us by a couple of points in the standings. It is almost the same as the other matches: it ends 0-1 despite some good opportunities; probably 1-1 would have been a fairer result, but their goalkeeper Jones is MoM. So after one point in four games we find ourselves back in 13th place in the standings. Shit!!!