Selecting a PCI QSA Vendor


If you are a merchant that accepts credit cards, you are required to comply with the Payment Card Industry (PCI) Data Security Standard (DSS), and you must demonstrate that compliance to your bank each calendar year.

You can find more information about what that means to your business here. Once you are ready to start your compliance effort, you will need to engage a third-party team to help make sure you are making the correct decisions about demonstrating that compliance and to review changes as they occur so you aren't making poor security decisions. That team will then certify your compliance with a standard-format report, called a Report on Compliance (ROC), that explains why they believe your environment is secure enough for customer transactions.


It can be difficult to find the right partner to guide you through this difficult and expensive process, but a little work at the beginning can save you headaches and expense later.

The first thing to remember is that you need to run your business, and that already means having a secure information technology (IT) environment and worrying about adequate security. Being PCI compliant and having a secure network are not the same thing. The PCI DSS provides a standard framework that lists the minimum requirements for implementing a secure IT environment. You have to know your specific requirements and what makes your environment different from the “standard” IT environment. Your Qualified Security Assessor (QSA) needs the skills to understand technology in general, and your specific environment in particular, before they can perform their job well.


Here are three tips to help you choose the right QSA for your PCI Compliance audit:

  • How experienced is the QSA? You don’t want to pay to train your QSA on how to perform their job. You want someone who already has a few years of experience.
  • What is the approach to the audit used by the QSA? You have to understand how the QSA will interact with your team and how well their approach will work with your culture and corporate environment. The best-case scenario is they fit seamlessly with your employees and have the professionalism and communication skills to work well with the entire team.
  • Will the QSA stand behind their work? If things go badly and there are questions about your security measures, or even a breach of customer data, you want someone who is confident in their security decisions and will defend them.

Pricing is another area that can be hard to compare from company to company. I looked at 5 different companies and they gave me 5 very different pricing proposals. The cheapest was almost half the cost of the most expensive. Several factors play into pricing, and those factors will vary for each company, but your overall network complexity, number of credit card transactions, and requested services can drive huge swings in pricing. Don’t be afraid to do some comparison shopping to find a vendor with pricing that meets your budget, but the cheapest vendor isn’t always the one you should select.

It can be worth paying slightly more if you are going to get much better service and support.

 

Cover Your Laptop’s Webcam


You may have seen several people covering their laptop webcams, including government officials and a prominent high-profile CEO or two. This may have you asking why they would choose to cover their webcam, and if you should be doing the same thing.


Hackers want access to any high-profile system, and video taken from a webcam can easily be used for blackmail. Imagine what you might capture from a high-profile CEO: footage of them working, or conversations recorded without their knowledge. Hackers can generate the most profit when they capture video or audio they can use as blackmail.

While it is unlikely that they would attack your laptop, you could still be a target if you have access to sensitive data or if your recorded activity can be used to gain access to other systems or devices.

Currently, the only way for a hacker to access your webcam is for them to gain access to your computer, which makes the attack similar to any other type of remote attack. You might receive an email with an attachment that secretly installs a Remote Administration Tool, or you might respond to a social engineering attack that convinces you to surrender control via a fake IT support call. Your laptop could be compromised and you wouldn’t even know they have taken control of your webcam, because they can disable the webcam activity LED.

Best Practice Recommendations

  • Keep the webcam lens (usually located at the top center of the laptop screen) covered with a piece of opaque sticky tape, except when it is actively being used.
  • Keep your laptop closed when it isn’t actively being used.
  • Always keep your software up to date, especially your web browsers and all associated plug-ins.
  • Enable your firewall at all times.
  • Always run anti-virus and routinely check for malware.
  • Avoid clicking links in emails, even when you know the sender.
  • If you get an email telling you your email account has been compromised or someone needs to verify your security settings, don’t click the link in the email. Contact the site directly.
  • If you get a call from IT asking for access to your computer, refuse them access and call your internal help desk directly. Ask questions and verify their identity before you allow any remote access.

22 DBA Responsibilities You Should Know About


Being a Database Administrator (DBA) is a tough job, and knowing what responsibilities are involved before you commit to that career can be very important.

In this article by Craig Mullins, we get his list of 22 DBA responsibilities:

  1. General database management.
  2. Data modeling and database design.
  3. Metadata management and repository usage.
  4. Database schema creation and management.
  5. Capacity planning.
  6. Programming and development.
  7. SQL code reviews and walk-throughs.
  8. Performance management and tuning.
  9. Ensuring availability.
  10. Data movement.
  11. Backup and recovery.
  12. Ensuring data integrity.
  13. Procedural skills.
  14. Extensible data type administration.
  15. Data security.
  16. Database auditing.
  17. General systems management and networking skills.
  18. Business knowledge.
  19. Data archiving.
  20. Enterprise resource planning (ERP).
  21. Web-specific technology expertise.
  22. Storage management techniques.

You can read the entire article, part 1 and part 2, to get all the details.

Scripts for listing all SQL Server Databases and Objects using PowerShell


This powerful script lists all objects in an instance and scripts them into a network folder, by date and instance, so you can keep a record of the objects.

This article by Angel Gomez gives you the script and some information on how to use it.

Using PowerShell and SQL Server Agent we can create a scheduled job that runs each day and produces scripts for all objects in all databases for an instance of SQL Server and that is what this tip does.

Here is the PowerShell code to generate a script for each object in the database. The code below scripts out table definitions, stored procedures, views, user-defined functions, and triggers, and it does so for every database on the SQL Server instance.

You need to supply the SQL Server name and the path where the objects are to be created.

$date_ = (Get-Date -Format yyyyMMdd)
$ServerName = "."   # For a named instance, use "ServerName\InstanceName".
$path = "c:\SQL_Server\Backup\Objects\" + "$date_"

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO')
$serverInstance = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $ServerName
$IncludeTypes = @("Tables","StoredProcedures","Views","UserDefinedFunctions","Triggers")   # Object types you want to back up.
$ExcludeSchemas = @("sys","Information_Schema")
$so = New-Object ('Microsoft.SqlServer.Management.Smo.ScriptingOptions')

$dbs = $serverInstance.Databases   # Replace with a filtered query if you want to limit which databases are scripted.
foreach ($db in $dbs)
{
    $dbname = "$db".replace("[","").replace("]","")
    $dbpath = "$path" + "\" + "$dbname" + "\"
    if (!(Test-Path $dbpath))
        { $null = New-Item -Type Directory -Name "$dbname" -Path "$path" }

    foreach ($Type in $IncludeTypes)
    {
        $objpath = "$dbpath" + "$Type" + "\"
        if (!(Test-Path $objpath))
            { $null = New-Item -Type Directory -Name "$Type" -Path "$dbpath" }
        foreach ($objs in $db.$Type)
        {
            if ($ExcludeSchemas -notcontains $objs.Schema)
            {
                $ObjName = "$objs".replace("[","").replace("]","")
                $OutFile = "$objpath" + "$ObjName" + ".sql"
                $objs.Script($so) + "GO" | Out-File $OutFile
            }
        }
    }
}

You can read the entire article here.

Making a Kali Bootable USB Drive on your Mac


Many people want to run a new version of Linux without the need for a new computer. The easiest, and probably fastest, way to run Kali Linux (this actually works the same way with most distributions) is to run it from a USB drive without installing it to your internal hard drive.

This simple method has several advantages:

  • It’s fast – Once you have the distribution installed on a bootable USB drive, you can boot to the login screen in just a few seconds, vs. installing and configuring the files on your internal hard drive.
  • It’s reversible — since this method doesn’t change any of your files on your internal drive or installed OS, you simply remove the Kali USB drive and reboot the system to get back to your original OS.
  • It’s portable — you can carry the Linux USB with you at all times so you can use it on most systems in just a few seconds.
  • It’s optionally persistent — you can decide to configure your Kali Linux USB drive to have persistent storage, so your data and configuration changes are saved across reboots.

In order to do this, we first need to create a bootable USB drive which has been set up from an ISO image of Kali Linux.

What You’ll Need

  1. A verified copy of the appropriate ISO image of the latest Kali build for the target system. You’ll probably select the 64-bit version in most cases.
  2. In OS X, you will use the dd command, which comes pre-installed on your Mac.
  3. A 4GB or larger USB thumb drive.
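
To verify the download in step 1, you can compare the file’s SHA-256 hash against the value published on the Kali downloads page. Here is a minimal sketch; the helper name and example filename are illustrative, and the expected hash must come from the official site:

```shell
# verify_iso: compare a file's SHA-256 hash against an expected value.
# Works with either shasum (pre-installed on OS X) or sha256sum (Linux).
verify_iso() {
  file=$1
  expected=$2
  if command -v shasum >/dev/null 2>&1; then
    actual=$(shasum -a 256 "$file" | awk '{print $1}')
  else
    actual=$(sha256sum "$file" | awk '{print $1}')
  fi
  if [ "$actual" = "$expected" ]; then
    echo "OK: checksum matches"
  else
    echo "FAIL: checksum mismatch" >&2
    return 1
  fi
}

# Usage (substitute the hash published for your ISO):
# verify_iso kali-linux-2016.2-amd64.iso <published-sha256>
```

If the hashes don’t match, download the ISO again before going any further; imaging a corrupt ISO produces a USB drive that won’t boot.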

Creating a Bootable Kali USB Drive on OS X

OS X is based on UNIX, so creating a bootable Kali Linux USB drive in an OS X environment is similar to doing it on Linux. Once you’ve downloaded and verified your chosen Kali ISO file, you use dd to copy it over to your USB stick.

WARNING: You can overwrite your internal hard drive if you do this wrong. Although this process is very easy, you should be very careful to follow the instructions.
  1. Without the target USB drive plugged into your system, open a Terminal window, and type the command diskutil list at the command prompt.
  2. This will display the device paths (look for the part that reads /dev/disk0, /dev/disk1, etc.) of the disks mounted on your system, along with information on the partitions on each of the disks.
  3. Plug in your USB device to your Mac in any open USB port, wait a few seconds, and run the command diskutil list a second time. Your USB drive will now appear in the listing and the path will most likely be the last one shown. In any case, it will be one which wasn’t present before. In this example, you can see that there is now a /dev/disk6 which wasn’t previously present.
  4. Unmount the drive (assuming, for this example, the USB stick is /dev/disk6 — do not simply copy this, verify the correct path on your own system!):
diskutil unmount /dev/disk6
  5. Proceed to (carefully!) image the Kali ISO file on the USB device. The following command assumes that your USB drive is on the path /dev/disk6, and you’re in the same directory with your Kali Linux ISO, which is named “kali-linux-2016.2-amd64.iso”:
sudo dd if=kali-linux-2016.2-amd64.iso of=/dev/disk6 bs=1m

Note: Increasing the block size (bs) will speed up the write process, but will also increase the chance of creating a bad USB stick.

Imaging the USB drive can take a good amount of time; over half an hour is not unusual, as the sample output below shows. Be patient and wait for the command to finish.

The dd command provides no feedback until it’s completed, but if your drive has an access indicator, you’ll probably see it flickering from time to time. The time to dd the image across will depend on the speed of the system used, USB drive itself, and USB port it’s inserted into. Once dd has finished imaging the drive, it will output something that looks like this:

2911+1 records in
2911+1 records out
3053371392 bytes transferred in 2151.132182 secs (1419425 bytes/sec)
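
If you want to get comfortable with the dd flags before touching a real device, you can exercise the same invocation against an ordinary file, which is completely safe. The filenames here are made up for the demonstration:

```shell
# Safe dd practice run: copy a small test file to another regular file
# instead of writing to a raw device like /dev/disk6.
demo_dd() {
  # BSD dd (OS X) spells the block size "1m"; GNU dd (Linux) wants "1M".
  dd if="$1" of="$2" bs=1m 2>/dev/null || dd if="$1" of="$2" bs=1M 2>/dev/null
}

printf 'pretend this is an ISO' > sample.img
demo_dd sample.img copy.img
cmp sample.img copy.img && echo "copy is identical"
```

The only difference between this and the real command is the of= target, which is exactly the part you need to double-check before running it for real.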

And that’s it! You can now boot into a Kali Live Installer environment using the USB device.

To boot from an alternate drive on an OS X system, bring up the boot menu by pressing the Option key immediately after powering on the device and select the drive you want to use.


Good Luck!

netdata: Linux Real Time Performance Monitoring


netdata should be installed on each of your Linux servers. It is the equivalent of a monitoring agent, as provided by other monitoring solutions: a highly optimized Linux daemon providing real-time performance monitoring for Linux systems, applications, and SNMP devices, over the web. It is useful for visualizing what is happening right now on your systems and applications.


This is what you get:

  • Stunning interactive bootstrap dashboards
    mouse and touch friendly, in 2 themes: dark, light
  • Amazingly fast
    responds to all queries in less than 0.5 ms per metric, even on low-end hardware
  • Highly efficient
    collects thousands of metrics per server per second, with just 1% CPU utilization of a single core, a few MB of RAM and no disk I/O at all
  • Sophisticated alarming
    supports dynamic thresholds, hysteresis, alarm templates, multiple role-based notification methods (such as email, slack.com, pushover.net, pushbullet.com, telegram.org, twilio.com)
  • Extensible
    you can monitor anything you can get a metric for, using its Plugin API (anything can be a netdata plugin, BASH, python, perl, node.js, java, Go, ruby, etc)
  • Embeddable
    it can run anywhere a Linux kernel runs (even IoT) and its charts can be embedded on your web pages too
  • Customizable
    custom dashboards can be built using simple HTML (no javascript necessary)
  • Zero configuration
    auto-detects everything, it can collect up to 5000 metrics per server out of the box
  • Zero dependencies
    it is even its own web server, for its static web files and its web API
  • Zero maintenance
    you just run it, it does the rest
  • scales to infinity
    requiring minimal central resources
  • back-ends supported
    can archive its metrics on graphite or opentsdb, in the same or lower detail (lower: to prevent it from congesting these servers due to the amount of data collected)
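
As a sketch of the back-end option above: in netdata 1.x, archiving to graphite is configured in the [backend] section of netdata.conf. The key names and values below are assumptions based on that era’s documentation, so check them against your installed version:

```ini
[backend]
    enabled = yes
    type = graphite
    destination = localhost:2003
    # send averaged (lower-detail) data to avoid congesting the server
    data source = average
    update every = 10
```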

This is what it currently monitors (most with zero configuration):

  • CPU
    usage, interrupts, softirqs, frequency, total and per core
  • Memory
    RAM, swap and kernel memory usage, including KSM the kernel memory deduper
  • Disks
    per disk: I/O, operations, backlog, utilization, space
  • Network interfaces
    per interface: bandwidth, packets, errors, drops
  • IPv4 networking
    bandwidth, packets, errors, fragments, tcp: connections, packets, errors, handshake, udp: packets, errors, broadcast: bandwidth, packets, multicast: bandwidth, packets
  • IPv6 networking
    bandwidth, packets, errors, fragments, ECT, udp: packets, errors, udplite: packets, errors, broadcast: bandwidth, multicast: bandwidth, packets, icmp: messages, errors, echos, router, neighbor, MLDv2, group membership, break down by type
  • Interprocess Communication – IPC
    such as semaphores and semaphores arrays
  • netfilter / iptables Linux firewall
    connections, connection tracker events, errors
  • Linux DDoS protection
    SYNPROXY metrics
  • fping latencies
    for any number of hosts, showing latency, packets and packet loss
  • Processes
    running, blocked, forks, active
  • Entropy
    random numbers pool, used in cryptography
  • NFS file servers and clients
    NFS v2, v3, v4: I/O, cache, read ahead, RPC calls
  • Network QoS
    the only tool that visualizes network tc classes in realtime
  • Linux Control Groups
    containers: systemd, lxc, docker
  • Applications
    by grouping the process tree and reporting CPU, memory, disk reads, disk writes, swap, threads, pipes, sockets – per group
  • Users and User Groups resource usage
    by summarizing the process tree per user and group, reporting: CPU, memory, disk reads, disk writes, swap, threads, pipes, sockets
  • Apache and lighttpd web servers
    mod-status (v2.2, v2.4) and cache log statistics, for multiple servers
  • Nginx web servers
    stub-status, for multiple servers
  • Tomcat
    accesses, threads, free memory, volume
  • mySQL databases
    multiple servers, each showing: bandwidth, queries/s, handlers, locks, issues, tmp operations, connections, binlog metrics, threads, innodb metrics, and more
  • Postgres databases
    multiple servers, each showing: per database statistics (connections, tuples read – written – returned, transactions, locks), backend processes, indexes, tables, write ahead, background writer and more
  • Redis databases
    multiple servers, each showing: operations, hit rate, memory, keys, clients, slaves
  • memcached databases
    multiple servers, each showing: bandwidth, connections, items
  • ISC Bind name servers
    multiple servers, each showing: clients, requests, queries, updates, failures and several per view metrics
  • Postfix email servers
    message queue (entries, size)
  • exim email servers
    message queue (emails queued)
  • Dovecot POP3/IMAP servers
  • IPFS
    bandwidth, peers
  • Squid proxy servers
    multiple servers, each showing: clients bandwidth and requests, servers bandwidth and requests
  • Hardware sensors
    temperature, voltage, fans, power, humidity
  • NUT and APC UPSes
    load, charge, battery voltage, temperature, utility metrics, output metrics
  • PHP-FPM
    multiple instances, each reporting connections, requests, performance
  • hddtemp
    disk temperatures
  • SNMP devices
    can be monitored too (although you will need to configure these)

And you can extend it, by writing plugins that collect data from any source, using any computer language.
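
As a sketch of what that looks like: an external netdata plugin is just a program that prints chart definitions and values to standard output using netdata’s plain-text protocol. The chart and dimension names below are made up for illustration, and a real plugin loops forever at its update interval rather than emitting a fixed number of samples:

```shell
# Minimal netdata external-plugin sketch: declare one chart with one
# dimension, then emit samples using the CHART/BEGIN/SET/END protocol.
emit_metrics() {
  samples=$1
  # The chart and its dimensions are declared once, at startup.
  echo "CHART example.demo '' 'Demo metric' 'units' example '' line"
  echo "DIMENSION value '' absolute 1 1"
  i=0
  while [ "$i" -lt "$samples" ]; do
    echo "BEGIN example.demo"
    echo "SET value = $((i * 10))"
    echo "END"
    i=$((i + 1))
  done
}

emit_metrics 3
```

netdata runs each plugin as a child process, reads this stream, and renders the chart on the dashboard with no further configuration.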

Tiny Core Linux


If you are looking for a small Linux distribution that can get you to a desktop from a roughly 10MB ISO, you should look at Tiny Core. Tiny Core offers a very fast experience overall, with a very rapid boot time that no other major distribution can match.

Tiny Core uses its own package format, but the package repository is huge, with thousands of applications. When you install applications, they are downloaded and loaded on the fly, and automatically added to the application bar.

The Core Project, as suggested by our name, is not a turnkey desktop distribution. Instead we deliver just the core Linux from which it is quite easy to add what you want. We offer 3 different x86 “cores” to get you started: Core, TinyCore, and our installation image, CorePlus.
  • Core (11 MB): Core is the base system which provides only a command line interface and is therefore recommended for experienced users only. Command line tools are provided so that extensions can be added to create a system with a graphical desktop environment. Ideal for servers, appliances, and custom desktops.
  • TinyCore (16 MB): TinyCore is the recommended option for new users who have a wired network connection. It includes the base Core system plus X/GUI extensions for a dynamic FLTK/FLWM graphical desktop environment.
  • CorePlus (106 MB): CorePlus is an installation image and not the distribution. It is recommended for new users who only have access to a wireless network or who use a non-US keyboard layout. It includes the base Core system and installation tools to provide for setup with the following options: choice of 7 window managers, wireless support via many firmware files and ndiswrapper, non-US keyboard support, and a remastering tool.