Magento 2’s Customer Attribute (And Address) Bizarreness

You would think that adding customer attributes which are editable only by admin accounts would be a straightforward process. Well, let me tell you that Magento 2 has other ideas!

Firstly, adding to forms. You’ll notice that just adding attributes to either adminhtml_customer or adminhtml_customer_address does nothing in the admin area. For some ungodly reason, to get them to show, attributes must also be added to the customer_account_edit and customer_address_edit forms.

Great! Attributes now appear in the admin area, but woe unto thee who thinks we’re done here!

Saving from the Admin Area

Hahaha, expecting them to save… just like that? Fool. Make sure you’ve also set the system key of your attribute to false, otherwise Magento will just ignore them.

They’ll need adding to an attribute set and group for this to work correctly; typically this is just the default of each for the entity type.

So now we should be done… right? RIGHT?! Well, young urchin, try updating a customer’s account information or address from the frontend of your store. What’s that? Hahaha, yes, Magento just obliterated your custom customer attribute values. Serves you right for being so optimistic.

Now, you’ll notice that even though we’ve added the attributes to the customer_account_edit and customer_address_edit forms to get them to show in the admin area (which, foolishly in my mind, should have no effect there), they do not show in the frontend.

No, if you want them to appear for customers, you’ll have to do more than add the attribute to a form (guffaws). But for this example, we don’t want them to show in the frontend, and we certainly don’t want the data to be sent into the flaming abyss when a customer saves them.

The fix, in true Magento style, is to set the visible key to false when creating the attribute, which persists to the customer_eav_attribute table’s is_visible column. Yes… visible prevents Magento from trying to save the data from the frontend. Cue manic laughter from the Magento devs as they revel in our pain.

Recap

customer_account_edit and customer_address_edit forms – show the fields in the admin area.
is_system – set to false for Magento to save the value from the admin area.
Add to attribute set and group – also required to save from the admin area.
visible – set to false to prevent Magento from trying to save the values from the frontend’s customer account section.

I’m not 100% sure what the adminhtml_customer and adminhtml_customer_address forms are actually used for anymore, but I’ll keep them for a laugh, just in case the Magento devs change their minds.

A code snippet:

protected function createCustomerAttributes(
    \Magento\Eav\Setup\EavSetup $eavSetup,
    ModuleDataSetupInterface $setup
) {
    $attributes = [];

    $customerSetup = $this->customerSetupFactory->create(['setup' => $setup]);
    $customerEntity = $customerSetup->getEavConfig()->getEntityType(\Magento\Customer\Model\Customer::ENTITY);
    $attributeSetId = $customerEntity->getDefaultAttributeSetId();
    $attributeSet = $this->attributeSetFactory->create();
    $attributeGroupId = $attributeSet->getDefaultGroupId($attributeSetId);

    // ... Add other attributes here 

    $attributes['attribute_code'] = [
        'type'     => 'varchar',
        'label'    => 'Attribute label',
        'input'    => 'text',
        // visible => false (is_visible) stops frontend account/address saves wiping the value
        'visible'  => false,
        // system => false (is_system) allows the value to be saved from the admin area
        'system'   => false,
        'required' => false,
        'user_defined' => true,
        'position' => 290
    ];

    foreach ($attributes as $code => $attribute) {

        $eavSetup->addAttribute(
            \Magento\Customer\Model\Customer::ENTITY,
            $code,
            $attribute
        );

        $createdAttribute = $this->eavConfig->getAttribute(
            \Magento\Customer\Model\Customer::ENTITY,
            $code
        );

        $createdAttribute->setData('used_in_forms', ['customer_account_edit', 'adminhtml_customer'])
            ->setData('attribute_set_id', $attributeSetId)
            ->setData('attribute_group_id', $attributeGroupId)
            ->save();
    }
}

Come on Magento 2, we need more hoops to jump through than that! It’s like it’s not even trying to be frustrating some days.

Create Attributes In Magento 2

Creating attributes in Magento 2 is very similar to Magento 1. Magento 2, for the most part, is smart enough to ignore the creation of attributes which already exist, and will instead update them. This is all well and good, unless your attributes have options, in which case you will need to check whether the attribute already exists, or Magento will duplicate the option values on subsequent runs of your installer or updater.
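For illustration, an option-based attribute definition might look something like the following hypothetical sketch (the attribute code, label and option values are made up), using the same array format as the installer examples below:

$attributes['attribute_with_options'] = [
    'type'         => 'int',
    'label'        => 'Attribute With Options',
    'input'        => 'select',
    'source'       => \Magento\Eav\Model\Entity\Attribute\Source\Table::class,
    'visible'      => true,
    'required'     => false,
    'user_defined' => true,
    // Without an existence check, these option values are appended again
    // on every run of the installer or updater.
    'option'       => ['values' => ['Option A', 'Option B']],
    'position'     => 300
];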

The fix for this is to check whether the attribute already exists before attempting to create it; however, this causes another issue. Magento’s EavConfig class caches attributes after the getAttribute method is called – so this cache needs clearing if we need to set other data on the attribute after creation, such as the forms it needs to exist in.

The following is an example to create a customer address attribute, taken from an installer class where $this->eavConfig is an instance of \Magento\Eav\Model\Config:

protected function createCustomerAddressAttributes(
    \Magento\Eav\Setup\EavSetup $eavSetup,
    ModuleDataSetupInterface $setup
) {

    $customerAddressSetup = $this->customerSetupFactory->create(['setup' => $setup]);
    $customerAddressEntity = $customerAddressSetup->getEavConfig()->getEntityType(
        \Magento\Customer\Api\AddressMetadataInterface::ENTITY_TYPE_ADDRESS
    );
    $attributeSetId = $customerAddressEntity->getDefaultAttributeSetId();
    $attributeSet = $this->attributeSetFactory->create();
    $attributeGroupId = $attributeSet->getDefaultGroupId($attributeSetId);

    $attributes = [];

    $attributes['attribute_code'] = [
        'type'     => 'text',
        'label'    => 'Attribute Label',
        'input'    => 'textarea',
        'visible'  => true,
        'required' => false,
        'user_defined' => true,
        'position' => 220
    ];

    foreach ($attributes as $code => $attribute) {
        $this->eavConfig->clear();
        // Don't create the attribute if it already exists
        $attributeCheck = $this->eavConfig->getAttribute(
            \Magento\Customer\Api\AddressMetadataInterface::ENTITY_TYPE_ADDRESS,
            $code
        );
        
        if ($attributeCheck->getAttributeId()) {
            continue;
        }

        // Stop Magento from loading the cached attribute with no Id.
        $this->eavConfig->clear();

        $attribute['system'] = 0;

        $eavSetup->addAttribute(
            \Magento\Customer\Api\AddressMetadataInterface::ENTITY_TYPE_ADDRESS,
            $code,
            $attribute
        );

        $createdAttribute = $this->eavConfig->getAttribute(
            \Magento\Customer\Api\AddressMetadataInterface::ENTITY_TYPE_ADDRESS,
            $code
        );

        $createdAttribute->setData('used_in_forms', ['adminhtml_customer_address', 'customer_address_edit'])
            ->setData('attribute_set_id', $attributeSetId)
            ->setData('attribute_group_id', $attributeGroupId)
            ->save();
    }

    return $this;        
}

If we didn’t have the $this->eavConfig->clear() calls in there, the installer would use the cached version of the attribute when performing our initial $attributeCheck, and the final save would try to create the attribute again.

Fixing slow reindexes in Magento 2 & MariaDB

We recently noticed some discrepancies when indexing catalog_category_product on MariaDB compared to MySQL 5.7. The indexer would take around 13 minutes on MariaDB, compared to around 3 seconds on MySQL. To add more confusion to the mix, MariaDB was running on a powerful staging server, and MySQL on a lowly MacBook Pro development machine.

After much head-scratching and career path questioning, the issue seemed to be related to the following statement in Magento\Catalog\Model\Indexer\Category\Product\Action\Full.php

$this->connection->query(
    $this->connection->insertFromSelect(
        $resultSelect,
        $this->tableMaintainer->getMainTmpTable((int)$store->getId()),
        $columns,
        AdapterInterface::INSERT_ON_DUPLICATE
    )
);

This is executed in the reindexCategoriesBySelect method, which creates a temporary table to work with when regenerating the index for a particular store. It turns out that MariaDB’s temporary table performance is woefully bad when large amounts of data are being inserted. This appears to be related to aria_used_for_temp_tables being set to ON, a value which can only be changed by recompiling MariaDB. See here and here.
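To check how a MariaDB instance has been compiled, the variable can be inspected from a MySQL client:

SHOW VARIABLES LIKE 'aria_used_for_temp_tables';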

The fix, without switching database engines or recompiling MariaDB, is to set Magento’s batchRowsCount to a lower number so that the database isn’t dealing with as many temporary table inserts at a time. Magento provides this config value for all of its indexers which use temporary tables, so adjust it for whichever indexer is giving slow performance. In our case, changing batchRowsCount from its default of 100000 down to 500 brought the indexer time down from 13 minutes to 6 seconds. The following, added to di.xml, was the panacea for our case:

<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="urn:magento:framework:ObjectManager/etc/config.xsd">
    <type name="Magento\Catalog\Model\Indexer\Category\Product\Action\Full">
        <arguments>
            <argument name="batchRowsCount" xsi:type="number">500</argument>
        </arguments>
    </type>
</config>

Magento’s documentation on use of this value can be found here, which states

“You can experiment to determine the ideal batch size. In general, halving the batch size can decrease the indexer execution time”.

Yup, experiment indeed with this value, YMMV! Happy indexing!

Magento 2’s uiRegistry Debugging

On the frontend of Magento 2, UI components are constructed hierarchically. Their names are derived from a concatenation of their parents’ names. Looking at checkout_index_index.xml, we can see that the checkout component contains a horrendous amount of config.

To make debugging of this gargantuan hellspawned rat’s nest easier, we can use the uiRegistry object in our browser’s console.

If we would like to get a particular object and we know the full concatenated name of the item, we can simply use something akin to the following:

requirejs('uiRegistry').get("checkout.steps.shipping-step.shippingAddress");

If, however, we would like to get a uiComponent using a property name, we can instead pass a query to the get method. In the example above, if we only knew the script location of the uiComponent in question, we could instead perform:

requirejs('uiRegistry').get("component = Magento_Checkout/js/view/shipping");

We can also pass in a callback method as the second parameter, where the item(s) returned are passed in as parameters to that function.
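For example, a quick sketch reusing the component name from above:

requirejs('uiRegistry').get('checkout.steps.shipping-step.shippingAddress', function (shippingAddress) {
    // Called once the component has been registered.
    console.log(shippingAddress.name, shippingAddress);
});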

Getting all registered uiComponents

The get method also allows us to pass in a callback function as the first parameter, instead of a query. This will pass all items sequentially through our callback, allowing us to see exactly what is registered:

requirejs('uiRegistry').get(function(item){
    console.log(item.name, item);
});

Do this on the checkout and prepare to have an exorcist level of data vomited into your poor, unsuspecting console.

Related Magento DevDoc

Stopping Magento 2 Redirecting from the Checkout

Sometimes, we want to add debug information when submitting Magento’s checkout to see exactly what is going on in the backend. Adding any debug information to the payload of Magento’s payment-information call will cause a redirect back to the cart page. This renders any information we’ve output unobtainable, even when using “Preserve Log” in Chrome Dev Tools. To stop this redirect, we can temporarily comment out the following in the placeOrder method of module-checkout/view/frontend/web/js/view/payment/default.js:

if (self.redirectAfterPlaceOrder) {
    redirectOnSuccessAction.execute();
}

collectTotals in Magento 2

If we have a custom quote object which we’re adding items to, we can recalculate the totals in the cart using the following method.

$shippingAddress = $quote->getShippingAddress();
$shippingAddress->unsetData('cached_items_all');
$quote->setTotalsCollectedFlag(false)->collectTotals();
$this->quoteRepository->save($quote);

We unset the cached item on the quote address object to force Magento to re-load the items for the address. This is necessary as Magento will sometimes not update this automatically, leading to an incorrect zero value quote and quote items.

Webpack – Assets

By default, Webpack only deals with javascript. For other filetypes, we need to add loaders to our config. We define these in the rules section of our config file, providing a regex in the test key to match the files to be processed (typically by file extension).

{
...
 rules: [
            {
                test: /\.css$/
                ...
            }
        ]
...
}

If we want to exclude files for any reason, we can add an exclude key, which is also a regex

 rules: [
            {
                test: /\.css$/,
                exclude: /file-not-to-be-processed.css/
                ...
            }
        ]  

CSS

Let’s add a CSS file to our project in src/css/style.css

body{
    background: red;
}

Now we need to include this file in our src/index.js file.

import './css/style.css';

In our webpack.config.js file, we can now add a definition for different filetypes which match on a filename regex:

const path = require('path');

module.exports = {
    entry: './src/index.js',
    output: {
        filename: 'bundle.js',
        path: path.resolve(__dirname, 'dist')
    },
    module: {
        rules: [
            {
                test: /\.css$/,
                use: [
                    'style-loader',
                    'css-loader'
                ]
            }
        ]
    }
};

And then install the css-loader and style-loader

npm install style-loader css-loader -D

When processing files, the loaders in the use array are applied from bottom to top, so the last loader listed runs first.
The CSS Loader loads CSS files and resolves any assets within them. The result is then passed to the Style Loader, which injects the CSS onto the page using a style tag.

In the example above, the loaders are passed in order as strings in an array. If we need to configure the loaders with options, for example to enable source maps, we can use an object instead of a string and provide an options key:

{
...
 rules: [
            {
                test: /\.css$/,
                use: [
                    {
                        loader: 'style-loader'
                    },
                    {
                        loader: 'css-loader',
                        options: {
                            sourceMap: true
                        }
                    }
                ]
            }
        ]
...
}

We will now be able to see files and line numbers when inspecting elements in the dev tools.

By default, the style-loader will create a separate style tag for each file being imported. To coalesce them into a single style tag, we can provide a singleton option key:

...
{
    loader: 'style-loader',
    options: {
        sourceMap: true,
        singleton: true
    }
}
...

Outputting CSS To A File

Instead of including all of the CSS in style tags, we can output it to a file. We’ll use the MiniCssExtractPlugin to achieve this.

npm install -D mini-css-extract-plugin

And then include it at the top of our config file

const MiniCssExtractPlugin = require('mini-css-extract-plugin');

We then need to register this as a plugin, which will output our file:

const MiniCssExtractPlugin = require('mini-css-extract-plugin');

{
...
 plugins: [
        new MiniCssExtractPlugin({
            filename: "style.css"
        })
    ],
...
}

This also requires a loader to be added to the process chain. The plugin’s loader will replace the style-loader which we used earlier.

{
...
    rules: [
            {
                test: /\.css$/,
                use: [
                    {
                        loader: MiniCssExtractPlugin.loader
                    },
                    { 
                        loader: "css-loader"
                    }  
                ]
            }
        ]
...
}

We can now link to dist/style.css from within our HTML file:

<link rel="stylesheet" href="dist/style.css">

Dynamically Including CSS

In the example above, we have to hardcode our output CSS file within our HTML file. This would be cumbersome if we were using a hashed filename which regenerates dynamically. We can achieve dynamic inclusion using the HTML Webpack Plugin:

npm install -D html-webpack-plugin

Now we can remove any hardcoded script or CSS tags from our index.html, and move it to the ./src folder. Next, we can require the plugin at the top of our config and add its definition:

const HTMLWebpackPlugin = require('html-webpack-plugin');

{
...
 plugins: [
        ...
        new MiniCssExtractPlugin({
            filename: "style.css",
        }),
        new HTMLWebpackPlugin({
            filename: '../index.html',
            template: './src/index.html'
        })
    ],
...
}

File Hashing

By default, the index.html file will be built in our dist folder; we’ll place it one level above in this example. We are now able to introduce file hashing to our application, as the CSS files are being injected dynamically. This means that if any of our file contents change, such as when a new CSS rule is added, the filename will change and the user will always request the newest version.

There are two ways to do this: we can either use the html-webpack-plugin to append a querystring hash to the end of our static filenames, or we can use the mini-css-extract-plugin to create a hashed filename for our CSS. We can also use a combination of the two.

{
...
 plugins: [
        new CleanWebpackPlugin(['dist']),
        new MiniCssExtractPlugin({
            filename: '[contenthash].css'
        }),
        new HTMLWebpackPlugin({
            filename: '../index.html',
            template: './src/index.html',
            hash: true
        })
    ],
...
}

The above example will do both, using [contenthash] as the filename and hash: true in the HTMLWebpackPlugin.

Taking this further, we can now use contenthash when including our javascript file in Webpack’s config, so instead of using just main.js, we can use

{
...
    output: {
        filename: '[name].[contenthash].js'
        ...
    }
...
}

contenthash should be used when we need the hash to change only when a particular piece of content changes. Using chunkhash or hash will cause regeneration when content outside of our context changes; e.g. if we change the CSS, then the JS file’s hash will also change.

It should be noted that HTMLWebpackPlugin’s hash option uses Webpack’s hash for the build, which will change when anything in the build changes. Therefore, when we’re using contenthash to generate filenames, we should turn this option off to avoid unnecessary downloads.

{
...
   plugins: [
        new HTMLWebpackPlugin({
            filename: '../index.html',
            hash: false
        })
   ]
...
}

Minifying CSS

All of the CSS we’ve been generating so far has been un-minified. To minify it for production, we can use the OptimizeCssAssetsPlugin.

npm install -D optimize-css-assets-webpack-plugin

We can then use it by requiring optimize-css-assets-webpack-plugin at the top of our config and adding its plugin to our definition:

{
   ...
    plugins: [
        new CleanWebpackPlugin(['dist'], { watch: false }),
        new MiniCssExtractPlugin({
            filename: "[contenthash].css",
        }),
        new HTMLWebpackPlugin({
            filename: '../index.html',
            hash: false,
            cache: true
        }),
        new OptimizeCssAssetsPlugin({})
    ],
   ...
}

Adding a HTML Template

By default, our index.html file will be generated automatically for us. Using the HTMLWebpackPlugin, we can specify a template file for the plugin to use, which may contain our initial HTML structure. Create a file at src/index.html and then add a template key to our HTMLWebpackPlugin config:

{
...
    plugins: [
    ...
        new HTMLWebpackPlugin({
            filename: '../index.html',
            template: './src/index.html',
            hash: false,
            cache: true
        }),
   ...
   ]
...
}

The template itself, in src/index.html:

<!doctype html>
<html>
 <head>
   <title>Getting Started</title>
 </head>
 <body>
    <header>HEADER</header>
 </body>
</html>

The CSS will now be added to the head of the template, and JS files will be added at the end of the body.

Sass

Sass is compiled in much the same way as standard CSS, but it uses two extra packages to achieve it: Sass Loader and Node Sass

npm install node-sass sass-loader -D

And then create a rule in our Webpack config

{
   ...
 rules: [
       ...
            {
                test: /\.scss$/,
                use: [
                    {
                        loader: MiniCssExtractPlugin.loader
                    },
                    {
                        loader: "css-loader"
                    },
                    {
                        loader: "sass-loader"
                    }
                ]
            }
      ...
     ]
}

We can then create a Sass file in our src directory, and import it in the same way as our previous CSS file. This will then be combined into our dist folder’s output CSS.
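For example, with a hypothetical src/scss/style.scss such as:

$background: red;

body {
    background: $background;
}

and the corresponding import in src/index.js:

import './scss/style.scss';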

Images

Images in our CSS can be handled using the url-loader Webpack loader.

npm install url-loader file-loader -D

And then add a loader rule to our config

test: /\.(png|svg|jpg|gif)$/,
use: {
    loader: 'url-loader',
    options: {
        name: '[path][name].[ext]',
        context: 'src',
        limit: 9000
    }
}

This will copy images used in our CSS or Sass and place them in the dist folder. The output path by default will include our src directory, so specifying the context: 'src' option tells url-loader not to include that prefix in the output path. We can also specify a limit (in bytes): any images under that size are inlined as data URLs in our application, which limits server requests for small images, while anything over the limit is emitted as a separate file via file-loader.
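For instance, a rule like the following in src/css/style.css (referencing a hypothetical src/img/logo.png) would be picked up by url-loader:

body {
    background-image: url('../img/logo.png');
}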

This will work for images in CSS; however, images imported using javascript will not have the correct path. To fix this, we can add a publicPath declaration to our Webpack output:

module.exports = {
...
    output: {
        filename: '[name].[contenthash].js',
        path: path.resolve(__dirname, 'dist'),
        publicPath: '/dist/'
    },
...
}

We can now perform imports in JS files with the correct path names, such as the following

import image from './img/test.jpg';

const img = document.createElement('img');
img.src = image;
document.querySelector('body').appendChild(img);

ES 6

To transpile our ES6 code, such as arrow functions, so that it works in browsers such as Internet Explorer, we can install Babel Loader.

npm install babel-loader @babel/core @babel/preset-env -D 

And then add a rule to our Webpack config:

rules: [
    {
      test: /\.m?js$/,
      exclude: /(node_modules|bower_components)/,
      use: {
        loader: 'babel-loader',
        options: {
          presets: ['@babel/preset-env']
        }
      }
    }
  ]
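As a quick check, an arrow function like the following in src/index.js (a made-up snippet) should now come out as plain ES5 in the bundled output:

const greet = name => `Hello, ${name}`;
console.log(greet('Webpack'));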

Webpack – Setting Up

Initialising

Webpack is installed using NPM. Initialise a new project, which will create a package.json file. Optionally use the flag -y to accept all defaults when initialising the project.

npm init

Now we can install Webpack and Webpack CLI via NPM using the following. The flag -D is for --save-dev, which adds both packages to our devDependencies.

npm install webpack webpack-cli -D

By default, Webpack 4 will use a default configuration file, so one isn’t necessary to get going. The default configuration values can be found here.

By default, Webpack will look in the folder “src” for the file “index.js”, and output to the folder “dist”, with the javascript file “main.js”. A handy Webpack config generator can be found here.
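Those defaults are roughly equivalent to writing a config like this yourself (a sketch for illustration, not Webpack’s literal internals):

const path = require('path');

module.exports = {
    entry: './src/index.js',
    output: {
        filename: 'main.js',
        path: path.resolve(__dirname, 'dist')
    }
};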

We can now create a file, index.html in the project root and link to the file “dist/main.js”.

<!doctype html>
<html>
 <head>
   <title>Getting Started</title>
 </head>
 <body>
    <script src="dist/main.js"></script>
 </body>
</html>

For our example, we’ll install Lodash, which is a utility library

npm install lodash --save

Now let’s create the entry point file in “src/index.js”, where we’ll import Lodash.

import _ from 'lodash';

console.log(_.join(["This", "is", "using", "Lodash"], " "));

This will import the default export from Lodash into our local “_” variable. We could use any name of our choosing for this local variable; however, underscore is the convention when using this package.

To build our javascript into the dist directory, we can run

npx webpack

NPX is a tool for executing node packages. It is typically only used for single-use commands, such as create-react-app. We’ll supersede its use later for our builds, but it’s useful for this example. Executables which are available for use with NPX are symlinked into the node_modules/.bin directory. Running NPX with any of the files in here can perform useful functions, e.g. mkdirp, which will make directories recursively:

npx mkdirp /directory1/directory2

Importing Individual Methods

If we didn’t want to use the entire lodash library, we could import individual methods from it using the following:

import {join as _join} from 'lodash'
console.log(_join(["This", "is", "using", "Lodash"], " "));

However, this will still import the entire Lodash library into our project. Instead, we can use the lodash-es module which defines each method as an ES module:

import {join as _join} from 'lodash-es'

console.log(_join(["This", "is", "using", "Lodash"], " "));

This example cuts our dist/main.js size down from ~70KB to ~1KB.

Building with a separate configuration file

If we need to test something using a separate configuration file, we can use NPX to achieve this. We can create a file called webpack.test.config.js in our project’s root with the following:

const path = require('path');

module.exports = {
    entry: './src/index.js',
    output: {
        filename: 'test.js'
    }
};

Note: config files use require to include modules, which has been part of Node.js for a long time; they do not use import

And then execute it with NPX:

npx webpack --config webpack.test.config.js

This will bundle our javascript in dist/test.js instead of dist/main.js.

Note that config files are standalone, so if we had a webpack.config.js in our project’s root, any config file passed to Webpack via NPX would not be merged with it.

Adding a Script Shortcut

We can assign shortcuts via NPM’s scripts facility. At a very basic level, we can set up a build command so that we do not have to rely on NPX. We can do that in our package.json file:

{
...
  "scripts": {
    "build": "webpack"
  },
...
}

We can then add a watch command which will automatically re-build our javascript based on changes to the project:

{
...
  "scripts": {
    "build": "webpack",
    "watch": "webpack  --watch"
  },
...
}

We can also specify other parameters, such as “mode” in our build scripts:

{
...
  "scripts": {
    "build": "webpack --mode production",
    "watch": "webpack --mode development  --watch"
  },
...
}

Note: when watching for changes, the watch process will have to be stopped and restarted when the config file is changed

Setting publicPath

publicPath is used when resolving things like images included in CSS files; it is prepended to their paths. For this example, we’ll set it to the dist folder.

{
...
    output: {
        filename: 'main.js',
        path: path.resolve(__dirname, 'dist'),
        publicPath: 'dist/'
    },
...
}

Source Maps

To allow easier debugging of code, we can set the devtool parameter of our config to one of the source map options. As Webpack itself only deals with javascript, this will only affect JS source maps; CSS and other loaders will need to be configured in their own right.

By default, setting the config’s mode to ‘development’ will enable source maps and ‘production’ will disable them; however, there are numerous other options which can be useful for both production and development.
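For example, to set one explicitly (using ‘source-map’ here purely as an illustration):

module.exports = {
    ...
    devtool: 'source-map'
};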

Development SourceMaps

These methods generate source maps inline, in the bundled JS files, adding bloat, and should not be used in production.

eval

Fast. Files are separated using the original directory structure, but the code generated by Webpack to load modules is present.

cheap-eval-source-map

Pretty Fast. Webpack loading code is removed.

cheap-module-eval-source-map

Medium Speed (but with fast rebuilds). Uses sourcemaps output by defined loaders.

eval-source-map

Slow. Adds column level mappings so inline breakpoints can be set.

Production SourceMaps

none

No SourceMaps are generated. This is the default in production, but leaves code difficult to debug when receiving user reports.

source-map

Source Maps are generated in a separate file. A comment tells dev tools (such as Chrome Dev Tools) where to find them, and the maps are only loaded when the dev tools themselves are opened.

hidden-source-map

Source Maps are generated in a separate file, but the comment telling the browser where to find them is omitted. They can then be loaded manually.

nosources-source-map

The same as source-map, but only gives the file name and line number. The code itself is not visible to the browser.

Cleaning The Output Folder

When changing Webpack settings, it’s often useful to get rid of old files in the output folder which may be left hanging around. To do this, we can use a plugin called Clean Webpack Plugin.

Install the plugin

npm install -D clean-webpack-plugin

And then require it and add it to the plugins section of our config file:

const CleanWebpackPlugin = require('clean-webpack-plugin');

module.exports = {
...
    plugins: [
        new CleanWebpackPlugin(['dist'])
    ]
...
}

The option passed into the first parameter of the plugin is an array of paths to be cleaned. This is required, as the plugin does not use Webpack’s output configuration. Supplying a folder name like above will completely remove that folder; however, we can also provide glob strings, such as dist/*, which would remove the files within the folder.

This will, by default, leave generated files alone when we’re watching files, so changes will continually generate new files in our dist folder. To clean and re-generate whilst watching, we can use the following option:

module.exports = {
...
    plugins: [
        new CleanWebpackPlugin(['dist'], {watch: true})
    ]
...
}

Overcoming Magento’s Double Grand Total Issue

Magento will sometimes calculate the total of the cart as double what it should actually be. This occurs typically when multiple collectTotals() calls are made to the shipping address. To overcome this issue, we can clear the cached items of the quote object and recalculate from there.


$quote = Mage::getSingleton('checkout/session')->getQuote();
$quote->setTotalsCollectedFlag(false);
$quote->getShippingAddress()->unsetData('cached_items_all');
$quote->getShippingAddress()->unsetData('cached_items_nominal');
$quote->getShippingAddress()->unsetData('cached_items_nonnominal');
$quote->collectTotals()->save();


WordPress is an arse and can’t do html entities properly