I recently updated my workflow, and I wanted to post it here. The last post I had didn't get too much attention, which is fine, but I *really* think this stuff is useful, especially for developers with many clients and servers.
Previously, I talked about Git and how I integrated it into my workflow. I'm still doing that, as I think it's incredibly important, both for maintaining code revisions and documentation and for maintaining various branches of the code.
What I've done now is integrate XML, XSLT, and some more bash scripting into my operation.
My build environment is Windows 7 with Git/Cygwin installed. This will all work from a *nix-based OS as well (Git and these tools originally come from Linux).
I use FileZilla as my FTP program, and cURL for uploading individual files.
The workflow is pretty much agnostic of the development environment, so things like Notepad++, TextEdit, TextMate, TextPad, Vi, Vim, emacs, etc. are all fine to use. Some have Git integration, while others don't.
I installed a program called xsltproc to process my XSL template.
This page (http://www.sagehill.net/docbookxsl/I...Processor.html) contains the necessary information to get xsltproc up and running on a Windows system.
For xsltproc, I put the necessary .dll files in the bin folder of my MivaScript compiler. That was just because I knew that folder was already in the Windows PATH environment variable, so I didn't have to worry about updating anything (Keep It Simple, Sir).
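Once the DLLs are in place, a quick sanity check from the Cygwin/bash prompt will tell you whether the shell can actually see the program:

```shell
# Print where the shell finds xsltproc; warn if it isn't on the PATH yet.
command -v xsltproc || echo "xsltproc not found on PATH"
```

If this prints a path, you're good to go; if not, double-check which folder you dropped the files into.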
For FileZilla I use the Site Manager, which I'm assuming most of you do too, if you use FileZilla. FileZilla keeps an XML file called "sitemanager.xml" which contains all of the username and password information for your FTP servers. What I wanted was a simple script that would pull the information from sitemanager.xml and let me upload a file to a specific client's site without much trouble.
So, I wrote an XSLT file, updated my upload.sh file and modified how the ftp.dat file worked.
Here are the working parts of the system:
FileZilla Site Manager -- When I have my clients' sites in the Site Manager, I use the "Comments" section to tell me where the Miva Merchant installation is located. If it is in "/httpdocs/mm5", then I leave it blank, as that is the default location. But if it's in "/devel/mm5", then I'll put that (and only that) in the comments section of the Site Manager.
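For reference, the slice of sitemanager.xml that the XSLT later cares about looks roughly like this (the values below are made up; also note that newer FileZilla versions store the Pass element base64-encoded, while the plain-text form shown here comes from older versions):

```xml
<FileZilla3>
  <Servers>
    <Server>
      <Host>ftp.example.com</Host>
      <Port>21</Port>
      <User>exampleuser</User>
      <Pass>examplepass</Pass>
      <Name>Foo Bar Dev Site</Name>
      <Comments>devel/mm5</Comments>
    </Server>
  </Servers>
</FileZilla3>
```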
ftp.dat --This file is located in the .git folder of the current repo that I'm working in.
It contains a single variable now: END_PATH_LOCATION
END_PATH_LOCATION is the location on the server where the file is located. To be a little more specific: it's the location relative to the Miva Merchant installation on the server. Why is that important? Because some of the installations are Merchant5, mm5, Merchant2, merchant2, or something completely different. But we don't want to have to remember that much; we just want to be able to upload and be done.
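To make that concrete, here's how the final upload URL gets assembled from the pieces: the Host from the Site Manager, the Comments field (or the /httpdocs/mm5 default when it's blank), and END_PATH_LOCATION. All of the values below are made up:

```shell
# Sketch of the URL assembly with made-up values.
HOST="ftp.example.com"
COMMENTS="devel/mm5"               # from the Site Manager "Comments" field; empty means the default location
END_PATH_LOCATION="modules/util"   # from the repo's .git/ftp.dat
FILE="example_module.mvc"

if [ -n "$COMMENTS" ]; then
  URL="ftp://$HOST/$COMMENTS/$END_PATH_LOCATION/"
else
  URL="ftp://$HOST/httpdocs/mm5/$END_PATH_LOCATION/"
fi
echo "curl -T $FILE $URL --user USER:PASS"
```

This is exactly the decision the XSLT template makes further down.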
So, here's an example for a module which resides in the util directory (modules/util under the Merchant root). My ftp.dat file would be located at module_name/.git/ftp.dat, and it would look like this:

```shell
# END_PATH_LOCATION is the location on the server that we're going to be uploading to
# For END_PATH_LOCATION, remember, we DON'T include the trailing slash!
END_PATH_LOCATION=modules/util
```
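To show how upload.sh will consume this file, here's a tiny self-contained demo. It builds a throwaway .git/ftp.dat in a temp directory and sources it, which is all the real script has to do:

```shell
# Demo: how upload.sh picks up END_PATH_LOCATION from .git/ftp.dat.
# Uses a throwaway directory so it can run anywhere.
demo=$(mktemp -d)
mkdir -p "$demo/.git"
cat > "$demo/.git/ftp.dat" <<'EOF'
# END_PATH_LOCATION is the location on the server we upload to (no trailing slash!)
END_PATH_LOCATION=modules/util
EOF

# Sourcing the file brings the variable into the current shell.
. "$demo/.git/ftp.dat"
echo "Will upload relative to the Merchant root: $END_PATH_LOCATION"
rm -rf "$demo"
```

Because ftp.dat is just shell variable assignments, there's no parsing to do at all.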
The upload.sh script is the next part. This is simple too. It has two forms of usage: 1 parameter, and 2 parameters.
The 1 parameter usage is for when you want to upload to a default server. Say you have a development server that you're using for instance. This will call the xsltproc program, and get the proper cURL parameters and execute the result.
So, the first syntax looks like:

```shell
upload.sh example_module.mvc
```

You simply tell it the name of the module you want to upload (in the directory you're currently in), and it'll upload it to a pre-defined default site.

The 2 parameter usage is for uploading a specific file to a specific server. That looks like this:

```shell
upload.sh example_module.mvc "Foo Bar Dev Site"
```

Where "Foo Bar Dev Site" is what you NAMED the server in your Site Manager in FileZilla.
The upload.sh file has two variables:

XSLT_LOCATION - This is where the .xsl file is located (we'll get to this next)
SITE_MANAGER_LOCATION - This is where your sitemanager.xml file is located.
Here's what that file looks like. (The two variable values below are placeholders; point them at wherever you keep the files. The `. ./.git/ftp.dat` line pulls END_PATH_LOCATION in from the current repo.)

```shell
#!/bin/sh
# $1 - File Name
# $2 - Site Name (As you've named it in your Site Manager)
# $END_PATH_LOCATION - The location of the upload
# Look into possibly getting the cwd, and uploading to that from a 'base'

XSLT_LOCATION="$HOME/scripts/select_server.xsl"        # placeholder path
SITE_MANAGER_LOCATION="$HOME/scripts/sitemanager.xml"  # placeholder path

# Read END_PATH_LOCATION from the repo's ftp.dat
. ./.git/ftp.dat

if [ $# -eq 1 ] ; then
    xsltproc --stringparam file_name "$1" --stringparam end_path_location "$END_PATH_LOCATION" "$XSLT_LOCATION" "$SITE_MANAGER_LOCATION" | sh
elif [ $# -eq 2 ] ; then
    xsltproc --stringparam file_name "$1" --stringparam site_name "$2" --stringparam end_path_location "$END_PATH_LOCATION" "$XSLT_LOCATION" "$SITE_MANAGER_LOCATION" | sh
else
    echo "Usage (1) is: upload.sh FILE_NAME (this will upload to the default site)"
    echo "Usage (2) is: upload.sh FILE_NAME SITE_NAME"
fi
```
The next part is the XSLT file: select_server.xsl
What I need to do is make sure that the location I save this file to is the same location that's defined in my upload.sh script for XSLT_LOCATION.
The two curl lines are the heart of it; the surrounding boilerplate follows the standard sitemanager.xml layout (FileZilla3/Servers/Server elements with Name, Host, User, Pass and Comments children), and the default value for site_name is whatever you want the 1-parameter usage to upload to:

```xml
<?xml version="1.0"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:output method="text"/>
  <xsl:param name="file_name"/>
  <!-- The default here is the site the 1-parameter usage uploads to; change it to yours -->
  <xsl:param name="site_name" select="'Foo Bar Dev Site'"/>
  <xsl:param name="end_path_location"/>

  <xsl:template match="/">
    <xsl:for-each select="FileZilla3/Servers/Server[Name = $site_name]">
      <xsl:choose>
        <xsl:when test="Comments != ''">
curl -T <xsl:value-of select="$file_name"/> ftp://<xsl:value-of select="Host"/>/<xsl:value-of select="Comments"/>/<xsl:value-of select="$end_path_location"/>/ --user <xsl:value-of select="User"/>:<xsl:value-of select="Pass"/>
        </xsl:when>
        <xsl:otherwise>
curl -T <xsl:value-of select="$file_name"/> ftp://<xsl:value-of select="Host"/>/httpdocs/mm5/<xsl:value-of select="$end_path_location"/>/ --user <xsl:value-of select="User"/>:<xsl:value-of select="Pass"/>
        </xsl:otherwise>
      </xsl:choose>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
```
When I have all of that set up (which takes only a couple seconds), I can now easily query the sitemanager.xml file, pull the current username and password for a client's FTP server, and go about my way.
This is what I've been wanting to do for a while now, and I'm glad that I did it. It has given me more time, and allowed for a more flexible environment to develop from.
What are some things I can improve on?
Centralized sitemanager.xml

If I set my sitemanager.xml location to my NAS, then I can use the script from any computer on my network without having to replicate the same data on each machine. That makes working from different computers easier (especially if you're working in the evening while watching Doctor Who with your son).
Better Awareness of Folders
One of the other things I want to do, is figure out how to get the script to look up the directory structure, back to the "parent", to find out where the .git folder is located.
If I can do that, then I can get it set up to do an "upload.sh" from anywhere within a specific repo, get the root location, and upload the file to the same location on the directory of the server.
Currently, the implementation is limited to files whose own directory contains the .git/ftp.dat.
So, if I have a structure like this (the js/ subfolder is just an example):

```
example_module/
    .git/
        ftp.dat
    example_module.mvc
    js/
        common.js
```

And I wanted to upload common.js, I would have to either do it from the example_module/ directory and specify the path relatively, or create a .git/ftp.dat in the js/ folder.
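That "walk up to the parent" lookup can be sketched in bash like so. (Just a sketch; when Git is available, `git rev-parse --show-toplevel` returns the same thing from anywhere inside a repo.)

```shell
# Climb from the current directory toward / until a .git folder is found,
# and print the directory that contains it (the repo root).
find_repo_root() {
  dir=$(pwd)
  while [ "$dir" != "/" ]; do
    if [ -d "$dir/.git" ]; then
      echo "$dir"
      return 0
    fi
    dir=$(dirname "$dir")
  done
  return 1   # no repo root found
}
```

With something like this in upload.sh, the script could source "$(find_repo_root)/.git/ftp.dat" from anywhere in the repo and compute the file's path relative to the root.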
I would seriously love to hear some feedback and critiques on this. I know my bash scripts probably look like they were written by an 8th grader, but I don't write them all that often. And my XSLT template is probably hacked together too, but it was my first time writing with that as well.
Again, let me know! I don't know if anyone else is interested in this stuff, but I wanted to share it.