Wednesday, July 21, 2021

Move-Mouse -- Keep your screen awake, session from dying, etc.

This was a fun project I did a while ago.  I work in industries that typically have higher IT security standards.  However, much of that is bureaucracy and the org just wants to check a box -- as in this case.

The org I wrote this for had a security policy that required a screen idle timeout and session logoff after 15 minutes.  That's good so that people don't stay logged into something in perpetuity; however, it becomes a pain in the neck for any process/operation that takes longer than 15 minutes (and some take hours).

So when I offered to solve the problem, they got approval from their security team and I provided a mouse-mover in PowerShell.

This was an interesting exercise as it really forced me to get a better understanding of both screen positioning and actions that interact with Windows built-in idle-tracking.  

As an example of one of my failed experiments: I could not locate any .NET class that interacted with Windows' idle tracking.  The .NET approach moved the mouse (or sent keystrokes) successfully, but the session was still terminated by the security policy.

So I went a little deeper and decided to use Platform Invocation (P/Invoke).  That did work -- it moved the mouse in a way that Windows' idle-timeout tracking recognized -- but it added a new layer of complexity.  In the end that didn't matter, but it's worth noting.

System.Windows.Forms.Cursor represents the cursor's position in screen pixels.

user32.dll's mouse_event() (wrapped as PoSh.Mouse.MoveTo() below) works on a 65535 x 65535 grid whose unit of movement is called a 'mickey'.

So... if you choose to display the mouse position but set $XY to something really small, you won't actually see the mouse move, and the numbers reported to the screen won't reflect any change.

Again, this is because if you use the default of moving 1 mickey, the movement is too small to register with the .NET Cursor class.
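
For reference, this is roughly what the failed .NET-only attempt looked like -- a minimal sketch, not the final approach.  It nudges the pointer by one pixel via the Cursor class (visibly, if you watch closely), but in my testing it did not satisfy Windows' idle-timeout tracking:

## .NET-only nudge (sketch); moves the cursor one pixel but does not reset the idle timer
Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName System.Drawing

$p = [System.Windows.Forms.Cursor]::Position
[System.Windows.Forms.Cursor]::Position = New-Object System.Drawing.Point(($p.X + 1), ($p.Y + 1))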

The logic employed oscillates the mouse back and forth.  Initial implementations didn't negate the previous movement, so over time the mouse would drift into oblivion.  ;)

Examples:


## Move the mouse to keep the screen awake:
Move-Mouse -XY 1 -Secs 1 -LoopInfinite $true -DisplayPosition $true

## Move the mouse once:
Move-Mouse -XY 1 -Secs 1 -DisplayPosition $true

## Get frustrated!!
## Remember that CTRL-C is your friend.  ;)
Move-Mouse -XY 100 -Secs 1 -LoopInfinite $true -DisplayPosition $true


Function Move-Mouse {
param (
    ## Declare variables
    [uint16] $XY=1,  ## Relative movement amount applied to both the x and y axes
    [int32] $Secs = 5, ## Number of seconds to sleep between mouse movements when LoopInfinite is defined
    [boolean] $LoopInfinite = $false,  ## Determines whether to loop infinitely or not
    [boolean] $DisplayPosition = $false  ## Determines whether to write the mouse's pixel location to the screen.
)

begin {

    ## Use a .NET type definition to access P/Invoke for the appropriate DLL and function.
    $typedef = @"
using System.Runtime.InteropServices;

namespace PoSh
{
    public static class Mouse
    {
        [DllImport("user32.dll")]
        static extern void mouse_event(int dwFlags, int dx, int dy, int dwData, int dwExtraInfo);

        private const int MOUSEEVENTF_MOVE = 0x0001;

        public static void MoveTo(int x, int y)
        {
            mouse_event(MOUSEEVENTF_MOVE, x, y, 0, 0);
        }
    }
}
"@
    ## Load the type definition into memory
    Add-Type -TypeDefinition $typedef

    ## Load System.Windows.Forms so the [System.Windows.Forms.Cursor] position reporting below works in a bare console
    Add-Type -AssemblyName System.Windows.Forms

}

process {

    ## Determine if we want to loop infinitely, default is false
    if ($LoopInfinite) {
        
        $i = 1
        while ($true) {
            ## Write the pixel location to screen
            if ($DisplayPosition) { Write-Host "$([System.Windows.Forms.Cursor]::Position.X),$([System.Windows.Forms.Cursor]::Position.Y)" }
            
            ## Use modulo to alternate the mouse movement back and forth, default is a relative 1,1 and then -1,-1
            if (($i % 2) -eq 0) {
                [PoSh.Mouse]::MoveTo($XY, $XY)
                $i++
            } else {
                [PoSh.Mouse]::MoveTo(-$XY, -$XY)
                $i--
            }

            Start-Sleep -Seconds $Secs
        }
    } else {
        if ($DisplayPosition) { Write-Host "$([System.Windows.Forms.Cursor]::Position.X),$([System.Windows.Forms.Cursor]::Position.Y)" }
    
        [PoSh.Mouse]::MoveTo($XY, $XY)
    }
}

}

Hackem Up! Disassembling Files into Chunks and Recombining Files from Chunks

So... I know I don't add much to my blog and for that, I apologize.   However, I had to do something recently that I thought my two subscribers might like.  ;)

I came across a situation where I needed to move many large files between disconnected environments and despite it being 2021, data transfer speeds can still be atrocious.

I was trying to move multiple 5GB-100GB disk backups from one state to another.  Aside from each copy being horrendously slow, experiencing a problem with an upload (e.g. a timeout) at 45GB of a 100GB file -- and losing those 8 hours -- is rather frustrating.

However, if I chop the large files up into smaller fragments and move (or sync) those fragments, I reduce the likelihood of hitting a terminating condition -- and even if I do, since each fragment is a small part of the whole, it doesn't take as long to recover.

I tried to find a free tool or something that already existed but all of the tools I could locate either cost money or didn't do what I wanted to do, so I decided to write some code that would chop up the files myself.

And correspondingly, I needed to be able to recombine the fragments on the other side into a file identical to the original source.  So without further ado, here are Chunk-File and Recombine-File:

Examples:

## Split the file into fragments
Chunk-File -FileName somefile.ext -ChunkSize 1GB
Chunk-File -FileName somefile.ext
## Recombine the file; the recombined file will have a '_new' name on it
Recombine-File -PathToChunks 'some-directory-path'
Recombine-File

## Verify the bytes were written back in the correct order
Get-FileHash -Algorithm MD5 sourcefile, sourcefile_new

NOTE:

Recombine-File does *not* delete the chunks.  This was intentional.  If any exception gets thrown during the recombine effort, I wanted to provide a non-destructive means of being able to try again (without having to re-copy the fragments).
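
Since the chunks are left behind, a hedged cleanup sketch (the file names here are placeholders) is to compare hashes first and only then delete the fragments yourself:

## Verify the recombined file, then remove the fragments
$src = Get-FileHash -Algorithm MD5 .\somefile.ext
$new = Get-FileHash -Algorithm MD5 .\somefile_new.ext
if ($src.Hash -eq $new.Hash) {
    Get-ChildItem .\somefile_*.ext | Where-Object { $_.BaseName -match '_[0-9]+$' } | Remove-Item
}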

Chunk-File:

function Chunk-File {
param (
    [Parameter(Mandatory=$true)][System.String]$FileName,
    [Parameter(Mandatory=$false)][uint64]$ChunkSize
)

    try {
        ## Get a file object reference to the passed in filename.
        $File = Get-Item $FileName

        ## Open a filestream handle to the file
        $fs = New-Object System.IO.FileStream($File.FullName, [System.IO.FileMode]::Open)

        ## If a desired-size is not specified, automatically determine an appropriate chunk size
        if (-not($ChunkSize)) {
            if ($fs.length -gt 10GB) {    
                $ChunkSize = 10GB
            } elseif ($fs.length -gt 1GB) { 
                $ChunkSize = 1GB
            } elseif ($fs.length -gt 100MB) {                                
                $ChunkSize = 100MB 
            } elseif ($fs.length -gt 10MB) { 
                $ChunkSize = 10MB
            } elseif ($fs.length -gt 1MB) {                                
                $ChunkSize = 1MB 
            } elseif ($fs.length -gt 100KB) { 
                $ChunkSize = 100KB
            } elseif ($fs.length -gt 10KB) {                                
                $ChunkSize = 10KB 
            } elseif ($fs.length -gt 1KB) { 
                $ChunkSize = 1KB
            } else {
                $ChunkSize = 1
            }
        }
        
        ## Ensure the chunk size isn't larger than the filesize
        if ($ChunkSize -gt $fs.Length) {
            Write-Error "Chunk size should not be larger than the file size."
            return
        }

        ## Determine acceptable buffer size for speed/efficiency
        if ($fs.length -gt 1GB) {    
            $BufferSize = 1MB
        } elseif ($fs.length -gt 1MB) { 
            $BufferSize = 1KB
        } else {                                
            $BufferSize = 1 ## 1B buffer
        }

        #Write-Host "ChunkSize:  $ChunkSize"
        #Write-Host "BufferSize: $BufferSize"

        ## Set the first buffer size
        $buffer = New-Object byte[] ($BufferSize)
        
        ## Set some predefined parameters for use with the chunking
        $FileIncrement = 1
        $ZeroPadSize = ([int]($fs.Length / $ChunkSize)).ToString().Length + 1

        ## Set the auto-increment and auto-decrement values
        $BytesToRead = $fs.Length
        $BytesRead = 0

        ## Open a filestream handle to the first output fragment
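        ## Fragment names follow the pattern <basename>_<zero-padded index><extension> (e.g. somefile_001.ext), so a default directory listing keeps them in order for Recombine-File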
        $cfs = New-Object System.IO.FileStream(("$($File.Directory)\$($File.BaseName)_$("$FileIncrement".PadLeft($ZeroPadSize, '0'))$($File.Extension)"), [System.IO.FileMode]::OpenOrCreate)

        ## Iterate through the source file to completion
        while ($BytesToRead -gt 0) {
            
            ## If the chunk file has reached the desired chunk size, close it and open the next chunk
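            ## NOTE: this boundary check assumes ChunkSize is a multiple of BufferSize, which holds for the auto-selected sizes above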
            if ($BytesRead -gt 0 -and $BytesRead % $ChunkSize -eq 0) {
                $cfs.Dispose()
                $FileIncrement++
                $cfs = New-Object System.IO.FileStream(("$($File.Directory)\$($File.BaseName)_$("$FileIncrement".PadLeft($ZeroPadSize, '0'))$($File.Extension)"), [System.IO.FileMode]::OpenOrCreate)
            }
        
            ## Handle the case where the file size is not an exact multiple of the buffer size.
            ## Without limiting the size of the final buffer, the last 'chunk' would be larger than it's supposed to be.
            ## The file would still be intact and functional, but would contain a padding of zeroes at the end that would change a hash verification on the recombined output file.
            if ($BytesToRead -lt $BufferSize) {
                $buffer = New-Object byte[] ($BytesToRead)
            } else {
                $buffer = New-Object byte[] ($BufferSize)
            }

            ## Read from the source
            [void]$fs.Read($buffer, 0, $buffer.Length)

            ## Write to the fragment
            $cfs.Write($buffer, 0, $buffer.Length)

            ## Increment/Decrement
            $BytesRead += $buffer.Length
            $BytesToRead -= $buffer.Length
        }
    } catch {
        $_
    } finally {
        ## Dispose only the streams that were actually opened
        if ($fs)  { $fs.Dispose() }
        if ($cfs) { $cfs.Dispose() }
    }
}

Recombine-File:


function Recombine-File {
param (
    [Parameter(Mandatory=$false)][System.String]$PathToChunks
)
    try {

        ## Get a collection of file fragments that match the naming convention from 'Chunk-File'
        ## If a path is not provided, the current directory is used
        if (-not($PathToChunks)) {
            $frags = Get-ChildItem | Where-Object { $_.BaseName -match '.*_[0-9]+$' }
        } else {
            $frags = Get-ChildItem $PathToChunks | Where-Object { $_.BaseName -match '.*_[0-9]+$' }
        }

        ## Ensure there are two-or-more fragments to recombine.
        if ($frags.Count -lt 2) {
            Write-Error "No chunks were found to recombine."
            return
        }
        
        ## Create a new file to write all of the fragmented data to
        $tfs = New-Object System.IO.FileStream(("$($frags[0].Directory)\$($frags[0].BaseName.Split('_')[0])_new$($frags[0].Extension)"), [System.IO.FileMode]::OpenOrCreate)

        ## Set an initial buffer
        $BufferSize = 1MB

        ## Iterate through each fragment to write to the new consolidated file
        $frags | ForEach-Object {
        
            ## Open a handle to the fragment
            $frag = New-Object System.IO.FileStream(("$($_.FullName)"), [System.IO.FileMode]::Open)

            ## Set the increment/decrement values for each fragment
            $BytesToRead = $frag.Length
            $BytesRead = 0

            ## Iterate over this fragment
            while ($BytesToRead -gt 0) {

                ## To ensure there's no extra data written to the consolidated file, adjust the buffer size for the final read
                if ($BytesToRead -lt $BufferSize) {
                    $buffer = New-Object byte[] ($BytesToRead)
                } else {
                    $buffer = New-Object byte[] ($BufferSize)
                }
                
                ## Read from the fragment
                [void]$frag.Read($buffer, 0, $buffer.Length)

                ## Write to the consolidated file
                $tfs.Write($buffer, 0, $buffer.Length)

                ## Increment/Decrement
                $BytesRead += $buffer.Length
                $BytesToRead -= $buffer.Length
            }

            $frag.Dispose()
        }

        Write-Output "Recombine successful:  $($tfs.Name)"
    } catch {
        $_
    } finally {
        if ($frag) { $frag.Dispose() }
        if ($tfs)  { $tfs.Dispose() }
    }
}

Tuesday, September 4, 2018

Create-TestFile - Fast and efficient ways to create large quantities of files and/or large-sized test files in PowerShell

Title:  Create-TestFile - Fast and efficient ways to create large quantities of files and/or large-sized test files in PowerShell

Description:  Provides a way to generate 'test' files in a fast and efficient manner.  It works for creating either large files or millions of tiny files.  The 'fill' of the files is one of:

  1. 'ByZero' (default) - Fastest - Fills the file with ASCII character position 0, not numeral 0
  2. 'ByRNG' - Slowest - Uses .NET System.Random class to fill the file(s) with random bytes
  3. 'ByCryptoRNG' - Fastest Random/Middle Overall - Uses the .NET Cryptography.RandomNumberGenerator class to fill the file(s) with random bytes
Using the 'Verbose' parameter emits parameters, timing, and crypto information about the file but significantly slows down the operation due to the hash calculation.

If an explicit file size is set, it should be a multiple of 1MB, 1KB, or at least 32 bytes.  Otherwise, the dynamic buffer size falls back to 1 byte and the speed of the operation will be crippled.  I can't think of a reason why someone might want to create a 2.83MB file versus a 5MB one, but I'm sure someone has a use-case.  I'll work on this.
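
To illustrate the alignment rule (the sizes here are just examples), you can run the same modulo tests the function uses:

## The same modulo checks the function performs, against a few example sizes
(10MB % 1MB) -eq 0       ## True  -> 1MB buffer
(10KB % 1MB) -eq 0       ## False
(10KB % 1KB) -eq 0       ## True  -> 1KB buffer
(2967470 % 1KB) -eq 0    ## False (~2.83MB)
(2967470 % 32) -eq 0     ## False -> falls through to the crippling 1-byte buffer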


Back-Story:  I was in a debate with a coworker of mine who is an adamant Linux-fan and stereotypically cautious of Microsoft products.  Previously working for Microsoft, I'm understandably a big fan of their products but I also like to understand the 'why' of an argument.

We were in a discussion about file servers and he brought up that NFS was superior to SMB.  His supporting evidence was a number of links he provided where Linux advocates plainly said that NFS was better -- but that was it.  The references were all supposition or personal feeling with no empirical data.  So I set out to do some testing of my own (it's incomplete, so a future blog post).  To do it properly, I needed to be able to emulate different scenarios of moving data: not only very large files, but also millions of tiny files.  Furthermore, knowing that some OSs/file systems perform data deduplication at the byte and/or block level, I needed to ensure that all files were unique so my speed tests weren't being tainted (much) by a higher-level optimization.

After some Google-Fu, it appeared there were a number of Windows tools for generating test files that could be purchased, but none I could find that were free -- or at least none that were free and didn't have people complaining about how slowly they created files.  So I set out to fix that problem and Create-TestFile is the result.

Examples:


## Create a test file with a specified name, size, and cryptographically random byte fill
Create-TestFile -FileName 'something.txt' -FileSize 10MB -FillMode ByCryptoRNG

## Create 100 test files with random file names, 1KB in size
0..99 | % { Create-TestFile }

## Create 1,000,000 test files with random file names, 10B in size
0..999999 | % { Create-TestFile -FileSize 10 }

## Create 10 test files, with random file names, 2GB in size
0..9 | % { Create-TestFile -FileSize 2GB }

## Create a single large file and look at verbose output which includes user-supplied parameters (if any), determined buffer size, fill type, and hash
## Caution -- using the Verbose flag dramatically increases function time because of the SHA-1 calculation
Create-TestFile -FileSize 2147483648 -Verbose


Script:


function Create-TestFile {
param (
  [Parameter(Mandatory=$false)][System.String]$FileName = "$((Get-Location).Path)\$([System.IO.Path]::GetRandomFileName())",
  [Parameter(Mandatory=$false)][uint64]$FileSize = 1KB,
  [Parameter(Mandatory = $false)] [ValidateSet('ByZero','ByRNG','ByCryptoRNG')] [System.String] $FillMode = 'ByZero'
)

  ## Determine acceptable buffer size for speed/efficiency
  if (($FileSize % (1024*1024)) -eq 0) { ## 1MB buffer  
    $buffsize = 1024*1024
  } elseif (($FileSize % (1024)) -eq 0) { ## 1KB buffer
    $buffsize = 1024
  } elseif (($FileSize % (32)) -eq 0) {  ## 32B buffer
    $buffsize = 32
  } else {                ## 1B buffer
    $buffsize = 1
  }
 
  $fs = $rng = $s1 = $RngMethod = $LoopCount = $buffer = $null
  $s1 = [datetime]::now
  try {
    $fs = New-Object System.IO.FileStream($FileName, [System.IO.FileMode]::OpenOrCreate)

    if ($FillMode -eq 'ByZero') {

      ## Fills the file with ASCII character number 0 -- FASTEST Speed
      $fs.SetLength($FileSize)

    } elseif ($FillMode -eq 'ByRNG') {
   
      ## Fills the file with random bytes generated by System.Random -- SLOWEST Speed, oddly enough
      $rng = New-Object System.Random

      $RngMethod = 'NextBytes'

    } elseif ($FillMode -eq 'ByCryptoRNG') {
   
      ## Fills the file with cryptographically random bytes generated by Cryptography.RandomNumberGenerator -- FASTEST non-zero fill
      $rng = [System.Security.Cryptography.RandomNumberGenerator]::Create()

      $RngMethod = 'GetBytes'

    }


    if ($FillMode -eq 'ByRNG' -or $FillMode -eq 'ByCryptoRNG') {
      $LoopCount = $FileSize / $buffsize
      for ($x=0; $x -lt $LoopCount; $x++) {
        $buffer = New-Object byte[] ($buffsize)
        $rng.$RngMethod($buffer)

        $fs.Write($buffer, 0, $buffer.Length)
      }
    }


  } catch {
    Write-Error $_
  } finally {
   
    ## If the FileStream is open, close it whether or not an error occurred
    if ($fs -ne $null) {
      $fs.Close()
    }

  }
 
  #Get-Item $FileName

  ## Only build the verbose report when -Verbose was requested; the SHA-1 calculation is what makes it slow
  if ($VerbosePreference -ne 'SilentlyContinue') {
    Write-Verbose (
      [pscustomobject][ordered]@{
        'FileName' = $PSBoundParameters['FileName']
        'FileSize' = $PSBoundParameters['FileSize']
        'FillMode' = $PSBoundParameters['FillMode']
        'RngBaseClass' = $rng
        'BufferSize' = $buffsize
        'CreateSecs' = ((([datetime]::Now) - $s1).TotalSeconds)
        'Sha1Hash' = (Get-FileHash -Algorithm SHA1 -Path $FileName).Hash
      } | Out-String
    )
  }
}







Tuesday, September 12, 2017

Query Security Center v4 API for IAVMs across multiple Nessus Scanners (ACAS)

Title:  Query Security Center v4 API for IAVMs across multiple Nessus Scanners (ACAS)
Description:  Remotely poll X number of scanners for IAVM numbers.

A customer had dozens of Nessus Scanners world-wide and needed up-to-date IAVM data from each of them.  Apparently Tenable has some sort of roll-up/replication server or capability, but it rarely worked and/or was rarely accurate.  Because of this, the customer had one person manually logging into the web interface of each Nessus scanner, manually searching for specific IAVMs, and then transposing the data to an Excel spreadsheet.  On average, they claimed there were anywhere from 15-30 IAVMs they wanted to track, and the Security Center interface only allowed querying for one IAVM at a time.  This was known to be a full-time job (8 hrs/day) for the person gathering the data.  ...Oh, and the report was generated every day.

After reading up on the Security Center API, which covered many of the functions necessary to interact with it (with examples in Perl/Python), I was able to produce a PowerShell script that polls each server for all of the IAVMs and saves the results to a CSV.  It is not multi-threaded, and it takes about 15 minutes to complete.  The script is listed below; it contains the interactions described above and includes the comments from the web-page debugging process (e.g. the raw HTTP POST data) that let me construct requests in the format the API expects.

To use:
  1. Copy the script below into a file and save it as whatever-you-want-to-call-it.ps1
    • I named mine Get-NessusV4Report.ps1 but I'm really querying the Security Center and only scanning for IAVMs so call yours something more accurate.
  2. Run the script in the following fashion:
    • PS C:\>whatever-you-want-to-call-it.ps1 -PathToServerList servers.txt -PathToIAVMList iavms.txt
  3. It will prompt for credentials, and assuming you have legitimate Nessus scanners in your servers.txt and legitimate IAVMs in your iavms.txt (and the SecurityCenter version is v4), it will produce a CSV on your desktop and let you know when it's done.  Sample input files are sketched below.
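
The input files are plain text, one entry per line; lines starting with '#' are ignored.  The hostnames and IAVM numbers here are placeholders:

## servers.txt -- one Nessus/SecurityCenter host per line
#acas-decom01
acas01
acas02.example.local

## iavms.txt -- one IAVM number per line
2016-B-0036
2017-A-0001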

Param (
    [parameter(Mandatory=$true)] [ValidateNotNullOrEmpty()] [string]$PathToServerList,
    [parameter(Mandatory=$true)] [ValidateNotNullOrEmpty()] [string]$PathToIAVMList,
    [ValidateNotNullOrEmpty()] [string]$OutputCsv = "$($env:USERPROFILE)\Desktop\NessusSC_VulnerabilitySummary_$(Get-Date -Format 'yyyy-MM-dd').csv"
)


if (-not(Test-Path $PathToIAVMList)) { Write-Warning "File not found, try again:  $($PathToIAVMList)"; break }
if (-not(Test-Path $PathToServerList)) { Write-Warning "File not found, try again:  $($PathToServerList)"; break }
#if (-not(Test-Path $OutputCsv)) { Write-Warning "File not found, try again:  $($OutputCsv)"; break }


#$PathToIAVMList = 'iavms.txt'
#$PathToServerList = 'Servers.txt'

$IAVMList = @([System.IO.File]::ReadAllLines($PathToIAVMList) | Where-Object { $_ -notmatch '^\#' })
$ServerList = @([System.IO.File]::ReadAllLines($PathToServerList) | Where-Object { $_ -notmatch '^\#' })

if ($IAVMList.Count -eq 0) { Write-Warning "IAVMList is empty.  Please populate $($PathToIAVMList) with the IAVM list you want to scan for on each ACAS server."; break }
if ($ServerList.Count -eq 0) { Write-Warning "ServerList is empty.  Please populate $($PathToServerList) with the servers you want to run this query against."; break }


$ValidCred = $false
Do {
    $Username = Read-Host -Prompt "Enter username"
    $Password = Read-Host -Prompt "Enter password" ## The password is read in plaintext intentionally; the API login expects a plaintext value, so a SecureString would just have to be converted back to plaintext anyway.

    if ([System.String]::IsNullOrEmpty($Username) -or [System.String]::IsNullOrEmpty($Password)) { 
        Write-Warning "You must enter a username and password to proceed."; continue
    }

    if ($Username -match '\\') {
        Write-Warning "A backslash character was found ('\'), please only supply a username without any domain identification."; continue
    }

    $ValidCred = $true
} Until ($ValidCred)




## These settings are required to successfully establish a connection to an SSL server due to the elevated security posture
## adopted by the org.
[Net.ServicePointManager]::Expect100Continue = $true
[Net.ServicePointManager]::SecurityProtocol = [System.Net.SecurityProtocolType]::Ssl3, [System.Net.SecurityProtocolType]::Tls, [System.Net.SecurityProtocolType]::Tls11, [System.Net.SecurityProtocolType]::Tls12

$Results = @()
#$ServerList = @('acas01')

foreach ($s in $ServerList) {

    Write-Host $s
    if (-not(Test-Connection $s -Count 1 -Quiet)) { Write-Warning "Could not connect to $($s), skipping..."; continue }

    [System.Net.WebRequest]::DefaultWebProxy = $null

    $Login = New-Object PSObject -Property @{
        'username' = $username
        'password' = $password
    
    }

    $ConnectBody = @{
        module = 'auth'
        action = 'login'
        input = (ConvertTo-Json -Compress $Login)
    }
 
    
    try {
        ## Login to the SecurityCenter API -- required by the API
        $ret = Invoke-WebRequest -URI "https://$($s)/request.php" -Method POST -Body $ConnectBody -UseBasicParsing -SessionVariable sv -ErrorAction Stop
    } catch {
        Write-Error $Error[0]
        continue
    }


    if ($ret.StatusCode -ne 200) { 
        Write-Warning "An error occurred with the HTTP request. HTTP Status Code: ($($ret.StatusCode)); HTTP Status Description ($($ret.StatusDescription))"
        continue
    }

    $ApiResponse = $ret.Content | ConvertFrom-Json
    if ($ApiResponse.error_code -ne 0) {
        Write-Warning "The API returned an error trying to authenticate to the server ($($s)).  `r`nAPI Error Code:  ($($ApiResponse.error_code)); `r`nAPI Error Message:  ($($ApiResponse.error_msg)) `r`nConstructed HTTP POST param:  $($ConnectBody.input)"
        continue
    }

 
    # Extract the token
    $resp = (ConvertFrom-Json $ret.Content)
    $token = $resp.response.token
    $sessionid = $resp.response.sessionID
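    ## The token must be echoed back on every subsequent request via the X-SecurityCenter HTTP header (see the query call below)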


    #$IAVMList = @('2016-B-0036')
    foreach ($IAVMId in $IAVMList) {

        ## Structure the query into objects that compress into the JSON format the API expects
        $QueryFilters = New-Object PSObject -Property @{
            'filterName' = 'iavmID'
            'operator' = '='
            'value' = $IAVMId
        } 

        $QueryData = @{
            sortDir = 'desc'
            sortField = 'severity'
            endOffset = 29
            tool = 'sumiavm'
            sourceType = 'cumulative'
            filters = '[' + (ConvertTo-Json -Compress $QueryFilters) + ']'
            startOffset = 0
        } 
        
        ## What the filter looks like after compression, if filtering on iavmid 2016-B-0036:
        ## [{"operator":"=","value":"2016-B-0036","filterName":"iavmID"}]


        $QueryBody = @{
            module = 'vuln'
            action = 'query'
            input = (ConvertTo-Json -Compress $QueryData).Replace('\','').Replace('"filters":"', '"filters":').Replace('}]",', '}],')
            token = $token
        }

        ## What the POST data (input property of the QueryBody var) looks like after compression:
        ## {"endOffset":29,"sourceType":"cumulative","filters":"[{\"operator\":\"=\",\"value\":\"2016-B-0036\",\"filterName\":\"iavmID\"}]","sortDir":"desc","sortField":"severity","startOffset":0,"tool":"sumiavm"}

        ## Notice all of the extra encoding of escape characters (\) and quotations in the wrong place.  The Replace() filters fix that.

        ## What the POST data looks like after compression and replacement filters:
        ## {"endOffset":29,"sourceType":"cumulative","filters":[{"operator":"=","value":"2016-B-0036","filterName":"iavmID"}],"sortDir":"desc","sortField":"severity","startOffset":0,"tool":"sumiavm"}

        try {
            $ret = Invoke-WebRequest -URI "https://$($s)/request.php" -Method POST -Headers @{"X-SecurityCenter"="$($token)"} -Body $QueryBody -UseBasicParsing -WebSession $sv -ErrorAction Stop
        } catch {
            Write-Error "An error occurred connecting to server ($($s)), with error:  $($error[0])"
            continue
        }
        #$ret

        if ($ret.StatusCode -ne 200) { 
            Write-Warning "An error occurred with the HTTP request. HTTP Status Code: ($($ret.StatusCode)); HTTP Status Description ($($ret.StatusDescription))"; continue
        }

        $ApiResponse = $ret.Content | ConvertFrom-Json
        if ($ApiResponse.error_code -ne 0) {
            Write-Warning "The API returned an error trying to authenticate to the server ($($s)).  API Error Code:  ($($ApiResponse.error_code)); API Error Message:  ($($ApiResponse.error_msg))"; continue
        }

        # Extract data from the response.  The response comes back as a single-line JSON string, so ConvertFrom-Json turns it into an object that PowerShell can easily manipulate.
        $data = (ConvertFrom-Json ($ret.Content)).response


        ## Write the results to the output object. The results are empty if 0 are returned so we need a slightly different object to report 0.
        if ($data.totalRecords -eq 0) {
            $Results += New-Object PSObject -Property @{
                'IAVMId' = $IAVMId
                'Severity' = ''
                'Total' = $data.totalRecords
                'HostTotal' = $data.totalRecords
                'Server' = $s
            } | Select-Object IAVMId,Severity,Total,HostTotal,Server
        } else {
            $Results += New-Object PSObject -Property @{
                'IAVMId' = $data.results.iavmId
                'Severity' = $data.results.severity
                'Total' = $data.results.total
                'HostTotal' = $data.results.hostTotal
                'Server' = $s
            } | Select-Object IAVMId,Severity,Total,HostTotal,Server
        }

        ## Manually wipe out variables so we don't get errant data from a previous query added to a query that didn't return correctly.
        $QueryFilters=$QueryBody=$QueryData=$ApiResponse=$data=$ret=$null
    }

    #$ConnectBody = @{
    #    module = 'auth'
    #    action = 'logout'
    #    input = '[]'
    #    token = $token
    #}
 
    # Logout of the SecurityCenter
    #$ret = Invoke-WebRequest -URI "https://$($s)/request.php" -Method POST  -Body $ConnectBody -UseBasicParsing -SessionVariable sv
}

$Results | Export-Csv -NoTypeInformation $OutputCsv

Write-Output "Results have been copied to:  $($OutputCsv)"


## Extracted HTTP POST data for Nessus Security Center report generation.
#{"sortDir":"desc","sortField":"severity","endOffset":29,"tool":"sumiavm","sourceType":"cumulative","filters":[{"filterName":"iavmID","operator":"=","value":"2016-B-0036"}],"startOffset":0}

How to Configure a Scheduled Task to run every X Seconds

Title:  How to Configure a Scheduled Task to run every X Seconds
Description:  Through the GUI, the shortest repeat interval Windows Task Scheduler easily allows is every 5 minutes (Trigger configuration:  Daily occurrence with a specified start time and 'Repeat task every' set to 5 minutes).  The workaround is to configure X number of TRIGGERS to achieve the desired task-execution frequency.

As an example, I wrote a network polling mechanism for a small/closed network to determine host uptime.  The customer wanted a frequency greater than every 5 minutes -- actually, they wanted it to run every 10 seconds -- so I had to create a mechanism to support this.  I found some posts saying Microsoft's position is that anything running more often than every 5 minutes should be a Windows Service, but I shouldn't have to write compiled code with installers just to run a few-line PowerShell script.  So the easy answer is to configure tons of Triggers with nearly identical configuration:


The key Trigger configuration parameters to duplicate are:

  1. Set the schedule setting to Daily.
  2. Ensure the recurrence is set to 1 day.
  3. Check the box to 'Repeat task every' and set it to 5 minutes and for a duration of 1 day.
  4. Ensure the Trigger is Enabled.
For every additional Trigger, increment the Start time from the previous Trigger by X mins or secs, depending on the desired interval.  In my case, the customer wanted my polling script to get metrics every 10 seconds so I had 30 Triggers configured with the following start times:


2017-09-11T00:00:00
2017-09-11T00:00:10
2017-09-11T00:00:20
2017-09-11T00:00:30
2017-09-11T00:00:40
2017-09-11T00:00:50
2017-09-11T00:01:00
2017-09-11T00:01:10
2017-09-11T00:01:20
2017-09-11T00:01:30
2017-09-11T00:01:40
2017-09-11T00:01:50
2017-09-11T00:02:00
2017-09-11T00:02:10
2017-09-11T00:02:20
2017-09-11T00:02:30
2017-09-11T00:02:40
2017-09-11T00:02:50
2017-09-11T00:03:00
2017-09-11T00:03:10
2017-09-11T00:03:20
2017-09-11T00:03:30
2017-09-11T00:03:40
2017-09-11T00:03:50
2017-09-11T00:04:00
2017-09-11T00:04:10
2017-09-11T00:04:20
2017-09-11T00:04:30
2017-09-11T00:04:40
2017-09-11T00:04:50


...which gave me the desired 10 second intervals to run the script.  Also, notice that I set the very first Trigger to start at Midnight.  This was intentional and served two purposes:
  1. It's easier for me to count by 5s/10s, starting from zero.  ;)
  2. I wanted to set this time in the past so that one of the remaining settings would be relevant for testing purposes (see below).
The final settings used to test all of this are on the Task itself.  Under Settings, ensure the following two settings are checked:
  1. Allow task to be run on demand
  2. Run task as soon as possible after a scheduled start is missed
That way, once you click OK to create the Task, it should start firing immediately, which is what I wanted.
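
If you'd rather script the Triggers than click through the GUI, here's a hedged sketch using the ScheduledTasks module (Windows 8 / Server 2012 or later).  The task name is a placeholder, and it uses Once triggers (good enough for a quick test), whereas the XML below uses Daily triggers so the pattern recurs every day:

## Build 30 triggers, 10 seconds apart, each repeating every 5 minutes for 1 day
$start = Get-Date -Date '2017-09-11T00:00:00'
$triggers = 0..29 | ForEach-Object {
    New-ScheduledTaskTrigger -Once -At $start.AddSeconds($_ * 10) `
        -RepetitionInterval (New-TimeSpan -Minutes 5) `
        -RepetitionDuration (New-TimeSpan -Days 1)
}

## Same test action as the XML below: append the current time to a text file on the desktop
$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-Command "& { [datetime]::Now.ToString() >> $env:UserProfile\Desktop\test_timer.txt }"'

Register-ScheduledTask -TaskName 'Test' -Trigger $triggers -Action $action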

Below is the Task I used to test this configuration before using it in production.  It simply runs at the specified interval and writes the current datetime to a text file on your desktop so you can see the incremental interval.  It might save you a few dozen clicks in the Trigger creation interface.

To import it, simply save it as an .xml file, and in Task Scheduler right-click the Task Scheduler Library and choose 'Import Task...'.


<?xml version="1.0" encoding="UTF-16"?>
<Task version="1.2" xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task">
  <RegistrationInfo>
    <Date>2017-09-11T13:50:23.6011472</Date>
    <Author></Author>
    <URI>\Test</URI>
  </RegistrationInfo>
  <Triggers>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:00:00</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:00:10</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:00:20</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:00:30</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:00:40</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:00:50</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:01:00</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:01:10</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:01:20</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:01:30</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:01:40</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:01:50</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:02:00</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:02:10</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:02:20</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:02:30</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:02:40</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:02:50</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:03:00</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:03:10</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:03:20</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:03:30</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:03:40</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:03:50</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:04:00</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:04:10</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:04:20</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:04:30</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:04:40</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT5M</Interval>
        <Duration>P1D</Duration>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2017-09-11T00:04:50</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
  </Triggers>
  <Principals>
    <Principal id="Author">
      <UserId>S-1-5-21-1271409858-1095883707-2794662393-1454529</UserId>
      <LogonType>InteractiveToken</LogonType>
      <RunLevel>LeastPrivilege</RunLevel>
    </Principal>
  </Principals>
  <Settings>
    <MultipleInstancesPolicy>IgnoreNew</MultipleInstancesPolicy>
    <DisallowStartIfOnBatteries>true</DisallowStartIfOnBatteries>
    <StopIfGoingOnBatteries>true</StopIfGoingOnBatteries>
    <AllowHardTerminate>true</AllowHardTerminate>
    <StartWhenAvailable>true</StartWhenAvailable>
    <RunOnlyIfNetworkAvailable>false</RunOnlyIfNetworkAvailable>
    <IdleSettings>
      <StopOnIdleEnd>true</StopOnIdleEnd>
      <RestartOnIdle>false</RestartOnIdle>
    </IdleSettings>
    <AllowStartOnDemand>true</AllowStartOnDemand>
    <Enabled>true</Enabled>
    <Hidden>false</Hidden>
    <RunOnlyIfIdle>false</RunOnlyIfIdle>
    <WakeToRun>false</WakeToRun>
    <ExecutionTimeLimit>PT72H</ExecutionTimeLimit>
    <Priority>7</Priority>
  </Settings>
  <Actions Context="Author">
    <Exec>
      <Command>powershell.exe</Command>
      <Arguments>-Command "&amp; { [datetime]::now.ToString() &gt;&gt; $env:UserProfile\Desktop\test_timer.txt } "</Arguments>
    </Exec>
  </Actions>
</Task>



Monday, June 5, 2017

PowerShell - Force Idle User Logoff After X Hours

I came across a customer who wanted to log users off of their machines after an idle period.  After getting a few more details, it turns out they really just wanted the ability to prevent users from staying logged in forever.

Since they were already using Windows' security feature of locking the screen on idle (https://technet.microsoft.com/en-us/library/jj966265(v=ws.11).aspx), this became a rather trivial task.

The logic works on the following design-parameters:
  • Run continuously throughout users' logon sessions.
  • Logoff users if they become idle for roughly 8 hours.
To accomplish this, the following logic was employed:

  • Detect an "idle" event
    • In this case, we're using the system's own idle-detection, resulting in a screen lock, which creates a system-level event that we can hook and respond to upon occurrence.
  • Respond to each event type
    • On screen-lock, we need to create a countdown timer that performs work (the logoff) if the countdown completes.
    • On screen-unlock, we need to be able to destroy/reset the timer so that it can handle being locked/unlocked multiple times throughout a work day and still fire once a user locks their screen to go home for the day.
To run the script, it needs to be executed in some way at user-logon.  That can be accomplished very easily in one of two standard ways:
  1. A Logon Script.  Group Policy or Active Directory User Account Properties (deprecated)
  2. A Scheduled Task.  The trigger for the scheduled task should be 'At user logon' (a sketch of this is shown just below)
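
A hedged sketch of option 2, using the ScheduledTasks module; the task name and script path are placeholders:

$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-WindowStyle Hidden -File C:\Scripts\Force-LogoffAfterX.ps1'
$trigger = New-ScheduledTaskTrigger -AtLogOn
Register-ScheduledTask -TaskName 'Force-LogoffAfterX' -Action $action -Trigger $trigger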
A few acknowledgements I'd like to add:
  • A smart user will be able to figure out what this script does and because it runs under their user context, can kill this process at-will.  There are a few options in this realm:
    • Log the relevant events (see below) -- Logging has been added.
    • Create a Windows service to perform this function which would require them to at least be an Administrator before they could kill the process

Change Log:
  1. Edited script to pop up a message to the user if their last session had been forcefully logged off by this script.

Script:
#########################################################################
## Title:         Force-LogoffAfterX.ps1
## Author:        Cameron Wilson (thepip3r)
## Create-Date:   2017-06-05
## Description:   Logs off users after a given timeframe once a screen
##                lock occurs.  This essentially becomes an idle-logoff.
## PowerShell V:  2.0+
## Environment:   Intended to be run infinitely during the user's logon session.
##                Must run under the logged-on user's context.  Expects
##                to be used in tandem with the idle screen lock security feature.
#########################################################################

$ScriptName = (Split-Path -Leaf $MyInvocation.MyCommand.Path)

## Define Logoff Hours Interval
$global:LogoffHours = 8
$global:LogPath = "$($env:TEMP)\$($env:COMPUTERNAME)_$($ScriptName).log"

## Stage the global objects required for the multiple disparate event tracking
$global:Timer = New-Object Timers.Timer
$global:Timer.Interval = ($global:LogoffHours*60*60*1000)
#$global:Timer.Interval = (20000)  ## 20 seconds for testing
$global:Job = $null

## Load System.Windows.Forms so the MessageBox notification below works
Add-Type -AssemblyName System.Windows.Forms

function global:ScriptLog ([string]$msg) {
    Write-Host $msg
    "$(Get-Date -UFormat ""%Y%m%d_%H%M%S"") - $($msg)" | Out-File -Append $global:LogPath
}

## Define the event handler with a ScriptBlock
$EventHandle = {
    param(
        [Microsoft.Win32.SessionSwitchReason]$EventReason
    )

    ## Handle each event, as required.
    ## On SessionLock, start a timer that executes the force-logoff if its 'Elapsed' period is reached
    if ($EventReason -eq [Microsoft.Win32.SessionSwitchReason]::SessionLock) {
        ScriptLog "A SessionLock event has occurred."
        $global:Job = Register-ObjectEvent -InputObject $global:Timer -SourceIdentifier "LockedScreenTimeoutWorker" -EventName Elapsed -Action {
            ScriptLog "Idle timer object fired after $($global:Timer.Interval) milliseconds. User will be forcibly logged off."
            (Get-WmiObject Win32_OperatingSystem -EnableAllPrivileges).win32shutdown(4)
        }
        $global:Timer.Start()

    ## On SessionUnlock, kill the logoff timer and reset it so the screen can be locked/unlocked repeatedly throughout a work day and still fire once a user locks their screen to go home
    } elseif ($EventReason -eq [Microsoft.Win32.SessionSwitchReason]::SessionUnlock) {
        ScriptLog "A SessionUnlock event has occurred."
        Unregister-Event -SourceIdentifier $global:Job.Name -Force
        $global:Timer.Stop()
    } else {
        ## Unhandled SessionSwitch event
    }
}

## If the previous session ended with a forced logoff, tell the user when it happened
if ([System.IO.File]::Exists($global:LogPath)) {
    $log = [System.IO.File]::ReadAllLines($global:LogPath)
    if ($log[-1] -match 'User will be forcibly logged off\.') {
        $t = $log[-1].Split(' - ')
        [System.Windows.Forms.MessageBox]::Show("Your previous session was logged off due to inactivity on/at: $($t[0])")
    }
}

try {
    ## Create the initial event subscription on the SystemEvents object to watch for the different "SessionSwitch" events.
    $SystemEvent = [Microsoft.Win32.SystemEvents]
    $lstw = Register-ObjectEvent -InputObject $SystemEvent -SourceIdentifier "LockedScreenTimeoutWatcher" -EventName "SessionSwitch" -Action { $EventHandle.Invoke($args[1].Reason) } -ErrorAction Stop
    ScriptLog "Successfully created the 'SessionSwitch' SystemEvent hook."
} catch {
    ScriptLog "An error occurred trying to register the 'SessionSwitch' SystemEvent hook: $_"
}

## Keep the script alive so the event subscriptions stay registered
while (1) {}

Thursday, June 7, 2012

Windows DNS logging quirk if you specify alternate path

In Windows DNS, the default logging path is in %SystemRoot%.  However, if you change this path (e.g. D:\DNS\dns.log), you need to make sure you do one critical thing:

CREATE THE NECESSARY FOLDER STRUCTURE!!!

In the case of the path I listed above, it is implied that there is a "D:" drive but you need to create a folder called "DNS" before you cycle the dns service.  Cycling the DNS service will create the "dns.log" file but not the folders in the path.
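
A hedged example of the fix -- the drive letter and path are just the ones from the example above, and Set-DnsServerDiagnostics requires the DnsServer module (Server 2012 or later; on older versions set the path in the DNS console instead):

## Pre-create the folder, point debug logging at it, then cycle the service
New-Item -ItemType Directory -Path 'D:\DNS' -Force | Out-Null
Set-DnsServerDiagnostics -LogFilePath 'D:\DNS\dns.log'
Restart-Service DNS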

This may seem obvious to some, but there is no indication that it isn't working.  There is no "This folder doesn't exist, would you like to create it?" prompt -- or whatever Microsoft usually says when you specify a path that doesn't exist.  The service itself doesn't complain; it's just configured to log and simply never does.

In a nutshell, the service WON'T create any missing folder structure but WILL create the log file itself.