Drupal 8 notes – Transfer site to local system for development

These are procedures and notes for downloading source code and database content for development use on a local workstation server. The goal is to create a mirror of the Drupal-based website for local development and testing.


You should have a local or alternate development system ready. This system will require a web server running PHP and a MySQL-compatible database. Ideally it should be similar to the system hosting the site you wish to copy for development purposes.

Other tools used on the local system include:

  • PHP-Composer – installed globally
  • Node / npm – installed globally
  • Drush – installed globally
  • Git – installed globally
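
A quick sanity check for the tools above; this sketch just reports whether each expected binary is on the PATH (the binary names are assumptions about how your system names them):

```shell
# Report whether each required tool is available on the PATH.
for tool in composer node npm drush git; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "OK:      $tool ($(command -v "$tool"))"
  else
    echo "MISSING: $tool"
  fi
done
```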

Transfer Procedures

  1. Export configuration sync package from remote server.
    A. Visit Configuration > Configuration Synchronization.
    B. Visit Export tab.
    C. Download the export file and keep it for local setup.
  2. Download Git Repo
  3. Perform composer update:
    composer update drupal/core-recommended --with-dependencies
  4. Download Drupal Database
  5. Install the database for the Git copy of the website. This may require your own settings.php file in /sites/default.
  6. !!! – Website should load at this point.
  7. !!! Deactivate secure login to stop https redirects:
    drush pm-uninstall securelogin
  8. Log in.
  9. Visit Configuration > Configuration Synchronization. There may be an error about a missing directory; make sure you create that directory path.
  10. Visit Import tab.
  11. Upload the file saved in step #1 > C.
  12. Follow the upload with “Import all” on the Synchronize Import screen.
  13. It’s always a good idea to clear the Drupal cache, either with Drush if it’s working:
    drush cache:rebuild
    or by visiting options under Configuration > Performance.
  14. They may be fine, but you can also check for pending database updates with Drush:
    drush updatedb
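
Put together end to end, the steps above look roughly like this sketch. The repository URL, dump file name, and paths are placeholders, and the Drush command spellings may vary by Drush version (pm-uninstall vs. pm:uninstall, config-import vs. config:import, etc.):

    git clone https://example.com/your-site-repo.git mysite
    cd mysite
    composer update drupal/core-recommended --with-dependencies
    drush sql:cli < ~/Downloads/site-backup.sql
    drush pm-uninstall securelogin
    drush config:import
    drush cache:rebuild
    drush updatedb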

Drupal 8 content type, view, block, twig notes

  • Large general tutorial video on making a directory with content types and views
  • PathAuto module information, for helping to make more readable URLs
  • Using Entity References in Views – a great overview of using entity references
  • Drupal 8 theme Twig template file reference – includes various naming conventions for pages, views, blocks, nodes, etc.


NPM and Yeoman update/install notes for MacOS

Had to deal with some issues introduced by bad Homebrew installations and possibly some old npm installs. My guess is that I cheated/screwed up by using a sudo install when I should have done something different.

This all started while playing with the WebDevStudios WordPress plugin generator for Yeoman.

Get a Plugin Kickstart with Yeoman & generator-plugin-wp!

This gist was very helpful – Fixing npm On Mac OS X for Homebrew Users

READ THIS Very Important Update

This entire guide is based on an old version of Homebrew/Node and no longer applies. It was only ever intended to fix a specific error message which has since been fixed. I’ve kept it here for historical purposes, but it should no longer be used. Homebrew maintainers have fixed things and the options mentioned don’t exist and won’t work.

I still believe it is better to manually install npm separately since having a generic package manager maintain another package manager is a bad idea, but the instructions below don’t explain how to do that.

Fixing npm On Mac OS X for Homebrew Users

Installing node through Homebrew can cause problems with npm for globally installed packages. To fix it quickly, use the solution below. An explanation is also included at the end of this document.


This solution fixes the error caused by trying to run npm update npm -g. Once you’re finished, you also won’t need to use sudo to install npm modules globally.

Before you start, make a note of any globally installed npm packages. These instructions will have you remove all of those packages. After you’re finished you’ll need to re-install them.
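
One low-tech way to capture that list before tearing things down (the output file name is just a suggestion):

npm ls -g --depth=0 > ~/npm-globals-before.txt

The --depth=0 flag keeps the listing to top-level packages only.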

Run the following commands to remove all existing global npm modules, uninstall node & npm, re-install node with the correct defaults, configure the location for global npm modules to be installed, and then install npm as its own package.

rm -rf /usr/local/lib/node_modules
brew uninstall node
brew install node --without-npm
echo prefix=~/.npm-packages >> ~/.npmrc
curl -L https://www.npmjs.com/install.sh | sh

Node and npm should be correctly installed at this point. The final step is to add ~/.npm-packages/bin to your PATH so npm and global npm packages are usable. To do this, add the following line to your ~/.bash_profile:

export PATH="$HOME/.npm-packages/bin:$PATH"

Now you can re-install any global npm packages you need without any problems.

Explanation of the issue

If you’re a Homebrew user and you installed node via Homebrew, there is a major philosophical issue with the way Homebrew and NPM work together. If you install node with Homebrew and then try to do npm update npm -g, you may see an error like this:

$ npm update npm -g
npm http GET https://registry.npmjs.org/npm
npm http 304 https://registry.npmjs.org/npm
npm http GET https://registry.npmjs.org/npm/1.4.4
npm http 304 https://registry.npmjs.org/npm/1.4.4
npm ERR! error rolling back Error: Refusing to delete: /usr/local/bin/npm not in /usr/local/lib/node_modules/npm
npm ERR! error rolling back     at clobberFail (/usr/local/Cellar/node/0.10.26/lib/node_modules/npm/lib/utils/gently-rm.js:57:12)
npm ERR! error rolling back     at next (/usr/local/Cellar/node/0.10.26/lib/node_modules/npm/lib/utils/gently-rm.js:43:14)
npm ERR! error rolling back     at /usr/local/Cellar/node/0.10.26/lib/node_modules/npm/lib/utils/gently-rm.js:52:12
npm ERR! error rolling back     at Object.oncomplete (fs.js:107:15)
npm ERR! error rolling back  npm@1.4.4 { [Error: Refusing to delete: /usr/local/bin/npm not in /usr/local/lib/node_modules/npm] code: 'EEXIST', path: '/usr/local/bin/npm' }
npm ERR! Refusing to delete: /usr/local/bin/npm not in /usr/local/lib/node_modules/npm
File exists: /usr/local/bin/npm
Move it away, and try again. 

npm ERR! System Darwin 13.1.0
npm ERR! command "/usr/local/Cellar/node/0.10.26/bin/node" "/usr/local/bin/npm" "update" "npm" "-g"
npm ERR! cwd /Users/dan/Google Drive/Projects/dotfiles
npm ERR! node -v v0.10.26
npm ERR! npm -v 1.4.3
npm ERR! path /usr/local/bin/npm
npm ERR! code EEXIST
npm ERR! 
npm ERR! Additional logging details can be found in:
npm ERR!     /Users/dan/Google Drive/Projects/dotfiles/npm-debug.log
npm ERR! not ok code 0

There’s an NPM bug for this exact problem. The bug has been “fixed” by Homebrew installing npm in a way that allows it to manage itself once the install is complete. However, this is error-prone and still seems to cause problems for some people. The root of the issue is really that npm is its own package manager, and it is therefore better to have npm manage itself and its packages completely on its own instead of letting Homebrew do it.

Also, using the Homebrew installation of npm will require you to use sudo when installing global packages. Since one of the core ideas behind Homebrew is that apps can be installed without giving them root access, this is a bad idea.


Another strange issue was caused by installing the generator-plugin-wp Yeoman generator on a broken yeoman/npm system. This GitHub issue helped: https://github.com/npm/npm/issues/10995. It needed a straightforward uninstall, cache clear, and reinstall. I may have reinitialized my terminal sessions to drop out of sudo mode. It looked something like:

$ sudo npm remove -g yo generator-plugin-wp
$ npm cache clean
# can’t remember if I relaunched terminal to drop out of sudo mode
$ npm install -g generator-plugin-wp

I’ll be testing this yeoman generator on the side for a while: https://github.com/WebDevStudios/generator-plugin-wp

Spying on a directory with auditd

Files started coming up missing on a server and I got freaked out looking for security holes, but sometimes users and other utilities are spiking the punch bowl. You can get serious about watching files with other utilities, but I went back to good ole auditd.

A simple test to track stuff getting trashed from an upload folder:

[code]auditctl -w /site-dir/wp-content/uploads/ -p wa -k upload_issue[/code]

A capital W will remove the rule:

[code]auditctl -W /site-dir/wp-content/uploads/ -p wa -k upload_issue[/code]

Do a quick search for issues with ausearch:

[code]ausearch -f wp-content/uploads[/code]
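
Since the rule above was tagged with a key (-k upload_issue), you can also search by that key instead of the file path; -i translates numeric fields into readable names:

[code]ausearch -k upload_issue -i[/code]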

Now permanently add the rule on a Red Hat system by putting this line in /etc/audit/audit.rules. Just leave off the auditctl command.

[code] -w /site-dir/wp-content/uploads/ -p wa -k upload_issue[/code]

Of course, you need to make sure your auditd process is running and enabled with chkconfig, etc. Good ole status check:

[code]/etc/init.d/auditd status[/code]

Here are a few of the resources I used:

Please forgive the RedHat auth-walls…


My Favorite Httrack commands

HTTrack is a website mirroring utility that can swamp your disks with mirror copies of the internet. I’ve had to use it several times to make offline copies of websites for all sorts of weird reasons. You’ll find HTTrack at: www.httrack.com. You can get a full list of command-line options at: https://www.httrack.com/html/fcguide.html. There is a spiffy web and Windows wizard interface for HTTrack, but I gave that up.

This is the recipe for the command-line options I’ve been using to produce a browsable offline version of accreditation documents. This command says: “Make an offline mirror of these URLs, go up to 8 links deep on these sites and 2 links deep on other domains, stay on the TLD (.edu), and do it as quickly as possible.” Be warned: as it currently stands this will fill up about 1.5GB of disk space ;P

[code]httrack http://www.nicholls.edu/sacscoc-2016/ http://www.nicholls.edu/catalog/2014-2015/html/ http://www.nicholls.edu/about/ -O /Users/nichweb/web-test -r8 -%e1 -%c16 -*c16 -B -l -%P -A200000[/code]

The great part is that the archive grows as URLs are added.
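
To refresh an existing mirror later, HTTrack can be re-run from the project directory with the update shortcut, which (as I understand it) re-reads the cached options so the URL list doesn’t need repeating:

[code]cd /Users/nichweb/web-test && httrack --update[/code]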

Apache log one-liners using tail, awk, sort, etc.

A good bunch of samples; these and other examples were found at: https://blog.nexcess.net/2011/01/21/one-liners-for-apache-log-files/

# top 20 URLs from the last 5000 hits
tail -5000 ./transfer.log | awk '{print $7}' | sort | uniq -c | sort -rn | head -20
tail -5000 ./transfer.log | awk '{freq[$7]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20

# top 20 URLs excluding POST data from the last 5000 hits
tail -5000 ./transfer.log | awk -F"[ ?]" '{print $7}' | sort | uniq -c | sort -rn | head -20
tail -5000 ./transfer.log | awk -F"[ ?]" '{freq[$7]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20

# top 20 IPs from the last 5000 hits
tail -5000 ./transfer.log | awk '{print $1}' | sort | uniq -c | sort -rn | head -20
tail -5000 ./transfer.log | awk '{freq[$1]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20

# top 20 URLs requested from a certain IP from the last 5000 hits
IP=; tail -5000 ./transfer.log | grep $IP | awk '{print $7}' | sort | uniq -c | sort -rn | head -20
IP=; tail -5000 ./transfer.log | awk -v ip=$IP '$1 ~ ip {freq[$7]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20

# top 20 URLs requested from a certain IP, excluding POST data, from the last 5000 hits
IP=; tail -5000 ./transfer.log | fgrep $IP | awk -F"[ ?]" '{print $7}' | sort | uniq -c | sort -rn | head -20
IP=; tail -5000 ./transfer.log | awk -F"[ ?]" -v ip=$IP '$1 ~ ip {freq[$7]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20

# top 20 referrers from the last 5000 hits
tail -5000 ./transfer.log | awk '{print $11}' | tr -d '"' | sort | uniq -c | sort -rn | head -20
tail -5000 ./transfer.log | awk '{freq[$11]++} END {for (x in freq) {print freq[x], x}}' | tr -d '"' | sort -rn | head -20

# top 20 user agents from the last 5000 hits
tail -5000 ./transfer.log | cut -d' ' -f12- | sort | uniq -c | sort -rn | head -20

# sum of data (in MB) transferred in the last 5000 hits
tail -5000 ./transfer.log | awk '{sum+=$10} END {print sum/1048576}'
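
These pipelines are easy to sanity-check against a synthetic log. The sketch below fabricates three combined-format lines (the IPs, paths, and user agent are made up) and runs the top-URLs pipeline; /index.html should come out on top with a count of 2:

```shell
# Create a tiny fake access log in combined log format.
cat > /tmp/transfer-demo.log <<'EOF'
1.2.3.4 - - [21/Jan/2011:10:00:00 -0600] "GET /index.html HTTP/1.1" 200 512 "-" "curl/7.0"
1.2.3.4 - - [21/Jan/2011:10:00:01 -0600] "GET /index.html HTTP/1.1" 200 512 "-" "curl/7.0"
5.6.7.8 - - [21/Jan/2011:10:00:02 -0600] "GET /about.html HTTP/1.1" 200 256 "-" "curl/7.0"
EOF

# Top URLs: field 7 is the request path in this format.
tail -5000 /tmp/transfer-demo.log | awk '{print $7}' | sort | uniq -c | sort -rn | head -20
```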

Using HyperDB to separate and share user and user_meta between WordPress installations

I need to remember to keep this example for some testing. This should be a good start for sharing a user and user_meta between websites. I do know that user_meta tends to have very site-centric settings at times. Original article was located at: http://wordpress.aspcode.net/view/63538464303732726666099/how-to-use-hyperdb-to-separate-and-share-a-user-dataset-between-wordpress-installs

$wpdb->add_database(array( // Connect to Users Database
    'host'     => DB_HOST,     // I am using the same host for my two DBs
    'user'     => DB_USER,     // I am using the same username for my two DBs
    'password' => DB_PASSWORD, // I am using the same p/w for my two DBs
    'name'     => 'my_user_db_name',
    'write'    => 0, // Change to 1 if you want your slave sites the power to update user data.
    'read'     => 1,
    'dataset'  => 'user',
    'timeout'  => 0.2,
));

$wpdb->add_database(array( // Main Database
    'host'     => DB_HOST,
    'user'     => DB_USER,
    'password' => DB_PASSWORD,
    'name'     => DB_NAME,
));

// Route queries against the users/usermeta tables to the 'user' dataset.
$wpdb->add_callback('user_callback');
function user_callback($query, $wpdb) {
    if ( $wpdb->base_prefix . 'users' == $wpdb->table || $wpdb->base_prefix . 'usermeta' == $wpdb->table ) {
        return 'user';
    }
}