Drupal Planet

Lullabot: Decoupled Drupal Hard Problems: Image Styles

1 week 2 days ago

As part of the API-First Drupal initiative, and the Contenta CMS community effort, we have come up with a solution for using Drupal image styles in a decoupled setup. Here is an overview of the problems we sought to solve:

  • Image styles are tied to the designs of the consumer and therefore belong to the front-end. However, there are technical limitations in the front-end that make it impossible to handle them there.
  • Our HTTP API serves an unknown number of consumers, but we don't want to expose all image styles to all consumers for all images. Therefore, consumers need to declare their needs when making API requests.
  • The Consumers and Consumer Image Styles modules can solve these issues, but they require some configuration from the consumer development team.
Image Styles Are Great

Drupal developers are used to the concept of image styles (aka image derivatives, image cache, resized images, etc.). We use them all the time because they are a way to optimize performance on our Drupal-rendered web pages. At the theme layer, the render system will detect the configuration on the image size and will crop it appropriately if the design requires it. We can do this because the back-end is informed of how the image is presented.

In addition to this, Drupal adds a token to the image style URLs. With that token, the Drupal server is saying: I know your design needs this image style, so I approve its use. This is needed to prevent a malicious user from filling up our disk by manually requesting every combination of image and image style. With this protection, only the combinations that are in our designs will be possible, because Drupal gives them its seal of approval. This is transparent to us, so our server is protected without us even realizing this was a risk.
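
To make the mechanism concrete, here is a minimal sketch (not taken from the article) of how a protected derivative URL is produced on the server side. The 'large' style is a core default and the file URI is a placeholder:

  use Drupal\image\Entity\ImageStyle;

  $style = ImageStyle::load('large');
  // buildUrl() appends the ?itok= token that authorizes this particular
  // image/style combination, so arbitrary combinations cannot be requested.
  $url = $style->buildUrl('public://photos/team.jpg');
  // e.g. https://example.com/sites/default/files/styles/large/public/photos/team.jpg?itok=aBcD1234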

The monolithic architecture allows us to have the back-end informed about the design. We can take advantage of that situation to provide advanced features.

The Problem

In a decoupled application your back-end service and your front-end consumer are separated. Your back-end serves your content, and your front-end consumer displays and modifies it. Back-end and front-end live in different stacks and are independent of each other. In fact, you may be running a back-end that exposes a public API without knowing which consumers are using that content or how they are using it.

In this situation, we can see how our back-end doesn't know anything about the front-end(s) design(s). Therefore we cannot take advantage of the situation like we could in the monolithic solution.

The most intuitive solution would be to output all the available image styles when requesting images via JSON API (or core REST). This will only work if we have a small set of consumers of our API and we know their designs. Imagine that our API serves three, and only three, consumers: A, B and C. If we did that, then when requesting an image from consumer A we would output all the variations for all the image styles for all the consumers. If each consumer has 10 - 15 image styles, that means 30 - 45 image style URLs, of which only one will be used.


This situation is not ideal, because a malicious user can still generate 45 images on our disk for each image available in our content. Additionally, if we consider adding more consumers to our digital experience, we risk making this problem worse. Moreover, we don't want the presentation of one consumer seeping into another consumer. Finally, if we can't know the designs of all our consumers, then this solution is not even on the table, because we don't know which image styles we need to add to our back-end.

On top of all these problems regarding the separation of concerns of front-end and back-end, there are several technical limitations to overcome. In the particular case of image styles, if we were to process the raw images in the consumer we would need:

  • An application runner able to do these operations. The browser is capable of this, but other, more limited devices won't be.
  • Powerful hardware to compute image manipulations. APIs often serve content to hardware with limited resources.
  • A high bandwidth environment. We would need to serve a very high-resolution image every time, even if the consumer will resize it to 100 x 100 pixels.

Given all this, we decided that this task was best suited to a server-side technology.

In order to solve this problem as part of the API-First initiative, we want a generic solution that works even in the worst-case scenario: an API served by Drupal to an unknown number of 3rd-party applications over which we don't have any control.

How We Solved It

After some research into how other systems tackle this, we established that we need a way for consumers to declare their presentation dependencies. In particular, we want to provide a way for consumer developers to express the image styles their application needs. The requests issued by an iOS application will carry a token that identifies the consumer where the HTTP request originated. That way the back-end server knows to select the image styles associated with that consumer.


For this solution, we developed two different contributed modules: Consumers, and Consumer Image Styles.

The Consumers Project

Imagine for a moment that we are running Facebook's back-end. We have defined the data model, we have created a web service to expose the information, and now we are ready to expose that API to the world. The intention is that any developer can join Facebook and register an application. In that application record, the developer does some configuration and tweaks some features so the back-end service can interact optimally with the registered application. As the managers of Facebook's web services, we are not in a position to take special requests from any of the possible applications. In fact, we don't even know which applications integrate with our service.

The Consumers module aims to replicate this feature. It is a centralized place where other modules can require information about the consumers. The front-end development teams of each consumer are responsible for providing that information.

This module adds an entity type called Consumer. Other modules can add fields to this entity type with the information they want to gather about the consumer (a sketch of this pattern follows the list below). For instance:

  • The Consumer Image Styles module adds a field that allows consumer developers to list all the image styles their application needs.
  • Other modules could add fields related to authentication, like OAuth 2.0.
  • Others could gather information for analytics purposes.
  • Maybe even configuration to integrate with other 3rd party platforms, etc.
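
Here is the promised sketch: a hypothetical example of a module attaching its own field to the Consumer entity type. The module and field names are assumptions for illustration only, and not necessarily what Consumer Image Styles actually does:

  use Drupal\Core\Entity\EntityTypeInterface;
  use Drupal\Core\Field\BaseFieldDefinition;
  use Drupal\Core\Field\FieldStorageDefinitionInterface;

  /**
   * Implements hook_entity_base_field_info().
   */
  function mymodule_entity_base_field_info(EntityTypeInterface $entity_type) {
    $fields = [];
    if ($entity_type->id() === 'consumer') {
      // Let each consumer list the image styles its designs need.
      $fields['image_styles'] = BaseFieldDefinition::create('entity_reference')
        ->setLabel(t('Image styles'))
        ->setSetting('target_type', 'image_style')
        ->setCardinality(FieldStorageDefinitionInterface::CARDINALITY_UNLIMITED);
    }
    return $fields;
  }
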
The Consumer Image Styles Project

Internally, the Consumers module takes a request containing the consumer ID and returns the consumer entity. That entity contains the list of image styles needed by that consumer. Using that list of image styles, Consumer Image Styles integrates with the JSON API module and adds the URLs for the image after applying those styles. These URLs are added to the response, in the meta section of the file resource. The Consumers project page describes how to provide the consumer ID in your request.

{
  "data": {
    "type": "files",
    "id": "3802d937-d4e9-429a-a524-85993a84c3ed",
    "attributes": { … },
    "relationships": { … },
    "links": { … },
    "meta": {
      "derivatives": {
        "200x200": "https://cms.contentacms.io/sites/default/files/styles/200x200/public/boyFYUN8.png?itok=Pbmn7Tyt",
        "800x600": "https://cms.contentacms.io/sites/default/files/styles/800x600/public/boyFYUN8.png?itok=Pbmn7Tyt"
      }
    }
  }
}

To do that, Consumer Image Styles adds an additional normalizer for the image files. This normalizer adds the meta section with the image style URLs.
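
As a rough illustration only (this is not the module's actual code), the essence of such a normalizer boils down to looping over the consumer's image styles and appending tokenized derivative URLs to the file's meta section:

  use Drupal\file\FileInterface;
  use Drupal\image\Entity\ImageStyle;

  /**
   * Appends derivative URLs to an already-normalized file resource.
   *
   * @param array $normalized
   *   The normalized JSON API resource for the file.
   * @param \Drupal\file\FileInterface $file
   *   The file entity being normalized.
   * @param \Drupal\image\Entity\ImageStyle[] $styles
   *   The image styles configured for the current consumer.
   *
   * @return array
   *   The normalized resource with a meta.derivatives entry per style.
   */
  function example_add_derivatives(array $normalized, FileInterface $file, array $styles) {
    foreach ($styles as $style) {
      // buildUrl() produces the tokenized URLs shown in the response above.
      $normalized['meta']['derivatives'][$style->id()] = $style->buildUrl($file->getFileUri());
    }
    return $normalized;
  }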

Conclusion

We recommend having a strict separation between the back-end and the front-end in a decoupled architecture. However, there are some specific problems, like image styles, where the server needs to have some knowledge about the consumer. In these very few occasions the server should not implement special logic for any particular consumer. Instead, we should have the consumers add their configuration to the server.

The Consumers project will help you provide a unified way for app developers to include this information on the server. Consumer Image Styles and OAuth 2.0 are good examples where that is necessary, and examples of how to implement it.

Further Your Understanding

If you are interested in alternative ways to deal with image derivatives in a decoupled architecture, there are other options that may incur extra costs but are still worth checking out: Cloudinary, Akamai Image Converter, and Origami.

Note: This article was originally published on October 25, 2017. Following DrupalCon Nashville, we are republishing (with updates) some of our key articles on decoupled or "headless" Drupal as the community as a whole continues to explore this approach further. Comments from the original will appear unmodified.

Hero Image by Sadman Sakib. Also thanks to Daniel Wehner for his time spent on code and article reviews.

Jacob Rockowitz: Our journeys within our community

1 week 3 days ago

To begin to address sustainability in Drupal and Open Source, it’s important to explore our journeys within the community. We need to examine how we work together to grow and build our software and community.

This is going to be one of the most challenging blog posts I have ever written because I am uncomfortable with the words: roles, maintainers, contributors and mentoring. All of these words help establish our Open Source projects and communities. Over the past two years, while working on the Webform module, I have learned the value of how each of these aspects relates to one another and to our Open Source collaboration and community.

Why am I uncomfortable with these words?

I am uncomfortable with these words because my general mindset and work habits are very independent and individualistic, but living on this island does not work well when it comes to Open Source. And changing my mindset and habits are things that I know need to happen.

Like many programmers, I went to art school where I learned the importance of exploring and discovering one's individual creative process. Another thing I had in common with many people who went to art school - I needed to figure out how to make a living. I went to the Brooklyn Public Library and started surfing this new thing called the World Wide Web. I was curious, confident and intrigued enough to realize that this was something I could and wanted to do - I could get a job building websites.

I built my first website, http://jakesbodega.com, using MS FrontPage while reading the HTML Bible and tinkering on a computer in the basement of my folks’ big blue house. After six months of self-teaching, I got my first job coding HTML at a small company specializing in Broadway websites. Interestingly, with the boom of the Internet, everyone's roles were constantly changing as companies grew to accommodate more...Read More

Commerce Guys: Human Presence protects Drupal forms after Mollom

1 week 3 days ago

On April 2, 2018, Acquia retired Mollom, a spam fighting tool built by Drupal founder Dries Buytaert. As Dries tells the story, Mollom was both a technical and financial success but was ultimately shut down to enable Acquia to deploy its resources more strategically. At its peak, Mollom served over 60,000 websites, including many of ours!

Many sites are looking for alternatives now that Mollom is shut down. One such service Commerce Guys integrated earlier this year in anticipation of Mollom's closing is Human Presence, a fraud prevention and form protection service that uses multiple overlapping strategies to fight form spam. In the context of Drupal, this includes protecting user registration and login forms, content creation forms, contact forms, and more.

Similar to Mollom, Human Presence evaluates various parameters of a visitor's session to decide if the visitor is a human or a bot. When a protected form is submitted, the Drupal module requests a "human presence" confidence rating from the API (hence the name), and if the response does not meet a configurable confidence threshold, it will block the form submission or let you configure additional validation steps if you choose. For example, out of the box, the module integrates with the CAPTCHA module to rebuild the submitted form with a CAPTCHA that must be completed before the form will submit.
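
The pattern is roughly the following (a hypothetical sketch, not the Human Presence module's actual code; the service name, method and threshold setting are placeholders):

  use Drupal\Core\Form\FormStateInterface;

  /**
   * Implements hook_form_alter().
   */
  function mymodule_form_alter(array &$form, FormStateInterface $form_state, $form_id) {
    if ($form_id === 'user_register_form') {
      $form['#validate'][] = 'mymodule_presence_validate';
    }
  }

  /**
   * Validation callback: reject the submission if the confidence is too low.
   */
  function mymodule_presence_validate(array &$form, FormStateInterface $form_state) {
    // Hypothetical service wrapping the remote confidence-rating API.
    $confidence = \Drupal::service('mymodule.presence_client')->getConfidence();
    $threshold = \Drupal::config('mymodule.settings')->get('threshold');
    if ($confidence < $threshold) {
      $form_state->setError($form, t('We could not verify that you are human. Please try again.'));
    }
  }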

We believe Human Presence is a great tool to integrate on its own or in conjunction with other standalone modules like Honeypot. Furthermore, they're joining other companies like Authorize.Net, Avalara, and PayPal as Drupal Commerce Technology Partners. Their integration includes support for protecting shopping cart and checkout forms, and we are looking for other ways they can help us combat payment fraud in addition to spam.

Learn more about Human Presence or reach the company's support engineer through their project page on drupal.org.

Acquia Developer Center Blog: Decoupling Drupal 8 with JSON API

1 week 3 days ago

As we saw in the previous post, core REST only allows individual entities to be retrieved, and Views REST exports only permit GET requests, not unsafe methods. But application developers often need greater flexibility and control, such as the ability to fetch collections, sort and paginate them, and access referenced related entities.

In this column, we'll inspect JSON API, part of the contributed web services ecosystem surrounding Drupal 8, which provides even more extensive features relevant to application developers, including relationships and complex operations such as sorting and pagination.
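
For a flavour of what that looks like in practice, here is a hedged sketch of fetching a sorted, paginated collection of articles, with a referenced image included, through the JSON API module using Guzzle. The base URL and field name are assumptions:

  $client = new \GuzzleHttp\Client(['base_uri' => 'https://example.com/']);
  $response = $client->get('jsonapi/node/article', [
    'headers' => ['Accept' => 'application/vnd.api+json'],
    'query' => [
      'sort' => '-created',          // Newest articles first.
      'page[limit]' => 10,           // Ten items per page.
      'page[offset]' => 10,          // Second page.
      'include' => 'field_image',    // Embed the referenced image entity.
    ],
  ]);
  $document = json_decode((string) $response->getBody(), TRUE);
  foreach ($document['data'] as $article) {
    print $article['attributes']['title'] . PHP_EOL;
  }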

Tags: acquia drupal planet

Virtuoso Performance: Importing specific fields with overwrite_properties

1 week 3 days ago
mikeryan | Tuesday, May 15, 2018 - 10:50am

While I had planned to stretch out my posts related to the "Acme" project, there are currently some people with questions about using overwrite_properties - so, I've moved this post forward.

By default, migration treats the source data as the system of record - that is, when reimporting previously-imported records, the expectation is to completely replace the destination side with fresh source data, discarding any interim changes which might have been made on the destination side. However, sometimes, when updating you may want to only pull specific fields from the source, leaving others (potentially manually-edited) intact. We had this situation with the event feed - in particular, the titles received from the feed may need to be edited for the public site. To achieve that, we used the overwrite_properties property on the destination plugin:

destination:
  plugin: 'entity:node'
  overwrite_properties:
    - 'field_address/address_line1'
    - 'field_address/address_line2'
    - 'field_address/locality'
    - 'field_address/administrative_area'
    - 'field_address/postal_code'
    - field_start_date
    - field_end_date
    - field_instructor
    - field_location_name
    - field_registration_price
    - field_remaining_spots
    - field_synchronized_title

When overwrite_properties is present, nothing changes when importing a new entity - but, if the destination entity already exists, the existing entity is loaded, and only the fields and properties enumerated in overwrite_properties will be, well, overwritten. In our example, note in particular field_synchronized_title - on initial import, both the regular node title and this field are populated from ClassName, but on updates only field_synchronized_title receives any changes in ClassName. This prevents any unexpected changes to the public title, but does make the canonical title from the feed available should an editor care to review and decide whether to modify the public title to reflect any changes.

Now, in this case we are creating the entities initially through this migration, and thus we know via the map table when a previously-migrated entity is being updated and overwrite_properties should be applied. Another use case is when the entire purpose of your migration is to update specific fields on pre-existing entities (i.e., entities not created by this migration). In this case, you need to map the IDs of the entities that are to be updated; otherwise, the migration will simply create new entities. So, if you had a "nid_to_update" property in your source data, you would include

process:
  nid: nid_to_update

in your migration configuration. The destination plugin will then load that existing node and only alter the specified overwrite_properties in it.

Tags: Drupal Planet Drupal Migration

Use the Twitter thread below to comment on this post:

Importing specific fields with overwrite_properties https://t.co/0H3W1Ll0ts

— Virtuoso Performance (@VirtPerformance) May 15, 2018

 

Virtuoso Performance: Drupal 8 migration from a SOAP API

1 week 4 days ago
mikeryan | Tuesday, May 15, 2018 - 10:12am

Returning from my sabbatical, as promised I’m catching up on blogging about previous projects. For one such project, I was contracted by Acquia to provide migration assistance to a client of theirs [redacted, but let’s call them Acme]. This project involved some straightforward node migrations from CSV files, but more interestingly required implementing two ongoing feeds to synchronize external data periodically - one a SOAP feed, and the other a JSON feed protected by OAuth-based authentication. There were a number of other interesting techniques employed on this project which I think may be broadly useful and which I haven’t previously blogged about. All in all, there was enough to write about that, rather than compose one big epic post, I’m going to break things down into a series of posts, spread out over several days so as not to spam Planet Drupal. In this first post of the sequence, I’ll cover migration from SOAP. The full custom migration module for this project is on Gitlab.

A key requirement of the Acme project was to implement an ongoing feed, representing classes (the kind people attend in person, not the PHP kind), from a SOAP API to “event” nodes in Drupal. The first step, of course, was to develop (in migrate_plus) a parser plugin to handle SOAP feeds, based on PHP’s SoapClient class. This class exposes functions of the web service as class methods which may be directly invoked. In WSDL mode (the default, and the only mode this plugin currently supports), it can also report the signatures of the methods it supports (via __getFunctions()) and the data structures passed as parameters and returned as results (via __getTypes()). WSDL allows our plugin to do introspection and saves the need for some explicit configuration (in particular, it can automatically determine the property to be returned from within the response).
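
As a quick illustration of that introspection, here is a minimal sketch using nothing but PHP's SoapClient; the WSDL URL and credentials mirror the demonstration values below and are not real:

  $client = new \SoapClient('http://services.example.com/CFService.asmx?wsdl');

  // Inspect the operations and complex types the service advertises in its WSDL.
  print_r($client->__getFunctions());
  print_r($client->__getTypes());

  // In WSDL mode, operations are exposed as ordinary method calls.
  $response = $client->GetClientSessionsByClientId([
    'clientId' => 1234,
    'clientCredential' => ['ClientID' => 1234, 'Password' => 'service_password'],
    'startDate' => '08-31-2016',
  ]);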

migrate_example_advanced (a submodule of migrate_plus) demonstrates a simple example of how to use the SOAP parser plugin - the .yml is well-documented, so please review that for a general introduction to the configuration. Here’s the basic source configuration for this specific project:

source:
  plugin: url
  # To remigrate any changed events.
  track_changes: true
  data_fetcher_plugin: http  # Ignored - SoapClient does the fetching itself.
  data_parser_plugin: soap
  # The method to invoke via the SOAP API.
  function: GetClientSessionsByClientId
  # Within the response, the object property containing the list of events.
  item_selector: SessionBOLExternal
  # Indicates that the response will be in the form of a PHP object.
  response_type: object
  # You won’t find ‘urls’ and ‘parameters’ in the source .yml file (they are inserted
  # by a web UI - the subject of a future post), but for demonstration purposes
  # this is what they might look like.
  urls: http://services.example.com/CFService.asmx?wsdl
  parameters:
    clientId: 1234
    clientCredential:
      ClientID: 1234
      Password: service_password
    startDate: 08-31-2016
  # Unique identifier for each event (section) to be imported, composed of 3 columns.
  ids:
    ClassID:
      type: integer
    SessionID:
      type: integer
    SectionID:
      type: integer
  fields:
    - name: ClientSessionID
      label: Session ID for the client
      selector: ClientSessionID
    ...

Of particular note is the three-part source ID defined here. The way this data is structured, a “class” contains multiple “sessions”, which each have multiple “sections” - the sections are the instances that have specific dates and times, which we need to import into event nodes, and we need all three IDs to uniquely identify each unique section.

Not all of the data we need for our event nodes is in the session feed, unfortunately - we want to capture some of the class-level data as well. So, while the base migration uses the SOAP parser plugin to get the session rows to migrate, we need to fetch the related data at run time by making direct SOAP calls ourselves. We do this in our subscriber to the PREPARE_ROW event - this event is dispatched after the source plugin has obtained the basic data per its configuration, and gives us an opportunity to retrieve further data to add to the canonical source row before it enters the processing pipeline. I won’t go into detail on how that data is retrieved since it isn’t relevant to general migration principles, but the idea is that since all the class data is not prohibitively large, and multiple sessions may reference the same class data, we fetch it all on the first source row processed and cache it for reference by subsequent rows.
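
A hedged sketch of that subscriber follows. It is not the project's actual code (which is in the Gitlab repository linked above), and the remote property names are assumptions; it uses the PREPARE_ROW event dispatched by migrate_plus:

  use Drupal\migrate_plus\Event\MigrateEvents;
  use Drupal\migrate_plus\Event\MigratePrepareRowEvent;
  use Symfony\Component\EventDispatcher\EventSubscriberInterface;

  class ClassDataSubscriber implements EventSubscriberInterface {

    /**
     * Class-level data keyed by ClassID, fetched once and reused for all rows.
     *
     * @var array
     */
    protected $classData;

    public static function getSubscribedEvents() {
      return [MigrateEvents::PREPARE_ROW => 'onPrepareRow'];
    }

    public function onPrepareRow(MigratePrepareRowEvent $event) {
      if ($event->getMigration()->id() != 'event') {
        return;
      }
      if (!isset($this->classData)) {
        // One direct SOAP call retrieves the class data for all rows.
        $this->classData = $this->fetchAllClassData();
      }
      $row = $event->getRow();
      $class_id = $row->getSourceProperty('ClassID');
      if (isset($this->classData[$class_id])) {
        // Expose the class-level value to the process pipeline.
        $row->setSourceProperty('ClassName', $this->classData[$class_id]['ClassName']);
      }
    }

    /**
     * Stub for the direct SOAP lookup (details omitted here).
     */
    protected function fetchAllClassData() {
      return [];
    }

  }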

Community contributions

SOAP Source plugin - Despite the title (from the original feature request), it was implemented as a parser plugin.

Altering migration configuration at import time - the PRE_IMPORT event

Our event feed permits filtering by the event start date - by passing a ‘startDate’ parameter in the format 12-31-2016 to the SOAP method, the feed will only return events starting on or after that date. At any given point in time we are only interested in future events, and don’t want to waste time retrieving and processing past events. To optimize this, we want the startDate parameter in our source configuration to be today’s date each time we run the migration. We can do this by subscribing to the PRE_IMPORT event.

In acme_migrate.services.yml:

services:
  ...
  acme_migrate.update_event_filter:
    class: Drupal\acme_migrate\EventSubscriber\UpdateEventFilter
    tags:
      - { name: event_subscriber }

In UpdateEventFilter.php:

class UpdateEventFilter implements EventSubscriberInterface {

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    $events[MigrateEvents::PRE_IMPORT] = 'onMigrationPreImport';
    return $events;
  }

The migration system dispatches the PRE_IMPORT event before the actual import begins executing. At that point, we can insert the desired date filter into the migration configuration entity and save it:

  /**
   * Set the event start date filter to today.
   *
   * @param \Drupal\migrate\Event\MigrateImportEvent $event
   *   The import event.
   */
  public function onMigrationPreImport(MigrateImportEvent $event) {
    // $event->getMigration() returns the migration *plugin*.
    if ($event->getMigration()->id() == 'event') {
      // Migration::load() returns the migration *entity*.
      $event_migration = Migration::load('event');
      $source = $event_migration->get('source');
      $source['parameters']['startDate'] = date('m-d-Y');
      $event_migration->set('source', $source);
      $event_migration->save();
    }
  }

Note that the entity get() and set() functions only operate directly on top-level configuration properties - we can’t get and set, for example ‘source.parameters.startDate’ directly. We need to retrieve the entire source configuration, modify our one value within it, and set the entire source configuration back on the migration.

Tags: Drupal Planet Drupal Migration

Use the Twitter thread below to comment on this post:

Drupal 8 migration from a SOAP API https://t.co/hf8LGiATsh

— Virtuoso Performance (@VirtPerformance) May 15, 2018

Web Wash: Managing Media Assets using Core Media in Drupal 8

1 week 4 days ago

There's a lot of momentum to fix media management in Drupal 8 thanks to the Media Entity module. By using a combination of Media Entity, Entity Embed, Entity Browser and some media providers such as Media entity image, you could add decent media handling to Drupal 8.

Then in Drupal 8.4, the Media Entity functionality was moved into a core module called Media. However, the core module was hidden by default. Now in Drupal 8.5 it's no longer hidden and you can install it yourself.

In this tutorial, you'll learn how to install and configure the Media module in Drupal 8 core. This tutorial is an updated version of the How to Manage Media Assets in Drupal 8 tutorial where we cover Media Entity.

Configuring Entity Embed and Entity Browser for the core Media module is essentially the same as with Media Entity. So if you have experience using Media Entity, then you'll be fine using the core Media module.

Hook 42: Giddy Up! Hook 42 Moseys over to Texas Drupal Camp

1 week 4 days ago

Dust off your saddle and get prepared to optimize your workflow. There is a lot packed into 3 days in Austin. Pull on your chaps, fasten your leathers, dig in your spurs and head on over to Texas Drupal Camp. On Thursday, make sure you check out the trainings and sprints. On Friday and Saturday, catch all of the keynotes and sessions.

Our own Ryan Bateman will be at Texas Drupal Camp to share his presentation about visual regression testing.

Texas Drupal Camp is Thursday, May 31st through Saturday, June 2nd at the Norris Conference Center in beautiful Austin, TX.

erdfisch: Drupalcon mentored core sprint - part 2 - your experience as a sprinter

2 weeks ago
12.05.2018 | Michael Lenahan

Hello! You've arrived at part 2 of a series of 3 blog posts about the Mentored Core Sprint, which traditionally takes place every Friday at Drupalcon.

If you haven't already, please go back and read part 1.

You may think sprinting is not for you ...

So, you may be the kind of person who usually stays away from the Sprint Room at Drupal events. We understand. You would like to find something to work on, but when you step in the room, you get the feeling you're interrupting something really important that you don't understand.

It's okay. We've all been there.

That's why the Drupal Community invented the Mentored Core Sprint. If you stay for this sprint day, you will be among friends. You can ask any question you like. The venue is packed with people who want to make it a useful experience for you.

Come as you are

All you need in order to take part in the first-time mentored sprint are two things:

  • Your self, a human who is interested in Drupal
  • Your laptop

To get productive, your laptop needs a local installation of Drupal. Don't have one yet? Well, it's your lucky day, because you can get your Windows or Mac laptop set up at the first-time setup workshop!

Need a local Drupal installation? Come to the first-time setup workshop

After about half an hour, your laptop is now ready, and you can go to the sprint room to work on Drupal Core issues ...

You do not need to be a coder ...

You do not need to be a coder to work on Drupal Core. Let's say, you're a project manager. You have skills in clarifying issues, deciding what needs to be done next, managing developers, and herding cats. You're great at taking large problems and breaking them down into smaller problems that designers or developers can solve. This is what you do all day when you're at work.

Well, that's also what happens here at the Major Issue Triage table!

But - you could just as easily join any other table, because your skills will be needed there, as well!

Never Drupal alone

At this sprint, no-one works on their own. You work collaboratively in a small group (maybe 3-4 people). So, if you don't have coding or design skills, you will have someone alongside you who does, just like at work.

Working together, you will learn how the Drupal issue queue works. You will, most likely, not fix any large issues during the sprint.

Learn the process of contributing

Instead, you will learn the process of contributing to Drupal. You will learn how to use the issue queue so you can stay in touch with the friends you made today and fix the issue over the coming weeks after Drupalcon.

It's never too late

Even if you've been in the Drupal community for over a decade, just come along. Jump in. You'll enjoy it.

A very welcoming place to start contributing is to work on Drupal documentation. This is how I made my first contribution, at Drupalcon London in 2011. In Vienna, this table was mentored by Amber Matz from Drupalize.Me.

This is one of the most experienced mentors, Valery Lourie (valthebald). We'll meet him again in part 3, when we come to the Drupalcon Vienna live commit.

Here's Dries. He comes along and walks around, no one takes any notice because they are too engaged and too busy. And so he gets to talk to people without being interrupted.

This is what Drupal is about. It's not about the code. It's about the people.

Next time. Just come. As a sprinter or a mentor. EVERYONE is welcome, we mean that.

This is a three-part blog post series:
Part one is here
You've just finished reading part two
Part three is coming soon

Credit to Amazee Labs and Roy Segall for use of photos from the Drupalcon Vienna flickr stream, made available under the CC BY-NC-SA 2.0 licence.

Tags: planet drupal-planet drupalcon mentoring code sprint

Sooper Drupal Themes: Drupal 8 vs Drupal 7 Performance 2018: Drupal 8.5 Is Faster With Glazed Theme Main Demo! | 8 Days To Drupal 8 | Day 7

1 month ago

We're counting down the days to the official SooperThemes Drupal 8 Release! Count with us as we will be writing a Drupal 8 related blog post every day for the next 8 days.

Drupal 8 is known for being heavier and slower than Drupal 7. However, Drupal 8 doesn't deserve this image and I think it got this image from lazy benchmarking: Testers just install Drupal's default profile, run some ab tests and call it a day. This is not a realistic test! Real Drupal websites will have hundreds of pages, menu items, configuration objects, more complex theming, more modules etc. 

I did some more extensive testing to compare our Drupal 7 products with the Drupal 8 versions that are to be released next week. The beautiful thing (for this test) about our products is that we maintain them with feature parity across Drupal 8 and 7. We provide installation profiles that combined have thousands of content items, menu items, and configuration settings that make them very close to real-world Drupal 8 & 7 websites. Perfect material for benchmarking!

Test 1: Drupal 8 vs Drupal 7 Cached Page Delivery

Drupal 8 default profile cached page benchmark. Drupal 7 255% faster than Drupal 8.

Ok let's get this over with first. Drupal 8 is built on a PHP core that is slower to parse than Drupal 7 and uses more memory. On an empty Drupal installation you will really notice this difference because the empty shell will act like a magnifying glass on the underlying architecture. In the chart above you can see that Drupal 7 is more than twice as fast at delivering cached pages to anonymous users. 

This test is what most people will refer to when saying Drupal 8 is slower than Drupal 7 but this is in fact the least interesting test for 2 reasons:

  1. Nowadays it is very easy to put a cache in front of Drupal. You can use Nginx, Varnish, or Cloudflare with the free plan to serve thousands of cached pages per second, regardless of whether Drupal 8 or 7 or anything else is behind the cache. 
  2. Any empty installation with Drupal's default profile is a bad model for a real website.
Test 2: Drupal 8 vs Drupal 7 Authenticated Views Portfolio Display

Drupal 8 Glazed Main Demo views display benchmark. Drupal 8  is 228% faster than Drupal 7.

For this second test we use the Main Demo installation profile with our Glazed Theme and Glazed Builder products. This means we have a dropdown menu with 100+ items, a database with around 100 nodes, lots of views displays and contrib modules, and our Glazed theme is enabled which subthemes the bootstrap basetheme. This is a much closer simulation of a real-life Drupal website than the default profile is!

We're benchmarking a views display that draws fields from nodes and taxonomy terms, does some interesting templating and is then pulled into a page using Glazed theme to display the view along with peripheral content.

As you can see in the chart, Drupal 8 is now twice as fast as Drupal 7. Despite the heavier core, Drupal 8 is faster when handling large amounts of configuration, views, and heavier themes like Glazed and the Bootstrap basetheme. I suspect this is due to the more elaborate caching architecture in Drupal 8. Template files in particular are now heavily cached, as are other important Drupal components.

Tested page: https://demo.sooperthemes.com/glazed-main/portfolio/premium1

Test 3: Drupal 8 vs Drupal 7 Authenticated Drag and Drop Page

Drupal 8 Glazed Main Demo views display benchmark. Drupal 8 is 12.5% faster than Drupal 7.

Our previous test showed an extreme example of Drupal 8 being faster due to its better handling of views. This 3rd test shows what we typically see when benchmarking Drupal 8: Drupal 8 is slightly faster in loading pages in our Glazed Theme demos. We see similar results when testing drag and drop pages, or other content types that don't use the drag and drop builder. In fact our Page Builder module does not significantly impact performance even when loading multiple Glazed Builder editors within the same page. 

The Drupal 8 and 7 installation profiles have near feature-parity, but the Drupal 8 version includes the Admin Toolbar module whereas the Drupal 7 profiles do not include an admin menu module. This means that each page comes with an additional 200 menu items in the Drupal 8 tests. In Drupal 7 I usually install the admin_menu module for faster administration, and this module is known to be a significant burden on the server. The fact that Drupal 8 is faster in our tests despite loading the additional 200 menu items is impressive!

Tested page: https://demo.sooperthemes.com/glazed-main/elements/layout-elements/columns

Conclusion: Drupal 8 Is A Heavier System With More Extensive Caching That Can Make It Faster Than Drupal 7 In Real-Life Situations

At the start of its life Drupal 8.0 got a lot of criticism for being slow. Now in 2018, Drupal 8.5 has seen a significant number of performance improvements and while it's still slower than Drupal 7 at the core, it's faster in complex situations that are more relevant to real-life Drupal websites.

Drupal 8 is faster where it matters, and more scalable! It's also important to add that neither test installation had any special settings enabling caching of content, views, blocks, etc. Drupal 8 has a much more advanced and more granular caching system that lets you fine-tune and optimize the experience for logged-in users on a grander scale than was ever possible with Drupal 7. Notably, there is the BigPipe module, which gives you lightning-fast load times for your primary content and can then separately lazy-load less important content, like the footer, menus, and sidebar blocks.

This test certainly brings good news to SooperThemes customers, who will enjoy a faster experience out of the box with our Glazed demo installation profiles. As a side note: importing demo content is also twice as fast in our Drupal 8 installation profiles versus Drupal 7.

What is your experience?

Drupal's performance is a complex thing to test and I'm sure you can get different results in varying situations. If you have any questions about the test, or if you want to share your own experience with Drupal 8's performance, let me know in the comments!

CiviCRM Blog: Building the Roparun Team Portal Part 1: Syncing civicrm participants to drupal user records

1 month ago

This is the first blog post about how we are building the team portal for Roparun.

But first, what is Roparun? The Roparun is a relay race of over 500 kilometres from Paris and Hamburg to Rotterdam, in which teams take part in an athletic event to raise money for people with cancer. It’s also called an adventure for life. This is also clear from the motto, which for years has been: ‘Adding life to days, when days often can’t be added to life’.

So each year Roparun organizes this race and around 400 teams participate in the event. The first part of the project was to set up donation functionality, and that is working right now.

The next part of the project is to create a new portal for team captains where they can manage their team data (e.g. name of the team, start location and the individual team members). We have chosen to have this in a separate Drupal website.

In CiviCRM, each team captain is registered as a participant in the Roparun event with the role team captain. The team captain can log in to the portal as soon as he has been registered as a team captain, and until the event is over.

The first part of this project was to let team captains log in, and for that we created a module called CiviMRF User Sync. This module builds on top of the CiviMRF framework.

This user sync module uses the CiviCRM API to create Drupal user accounts. See the screenshot below for the configuration.

What you can see is that we use a custom API to retrieve the team captains. This custom API returns the email address, contact ID and team ID of the team captain. We store the e-mail address both as the username and in the email field on the user account.

As soon as a new team captain is registered, a new user record is created and the team captain receives an e-mail with a link to create a password.
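
The general pattern looks roughly like this (a hedged sketch, not the CiviMRF User Sync module's actual code; the captain data stands in for what the custom CiviCRM API returns):

  use Drupal\user\Entity\User;

  // Values as returned by the custom CiviCRM API: email, contact ID, team ID.
  $captain = ['email' => 'captain@example.com', 'contact_id' => 42, 'team_id' => 7];

  if (!user_load_by_mail($captain['email'])) {
    $account = User::create([
      // The e-mail address doubles as the username, as described above.
      'name' => $captain['email'],
      'mail' => $captain['email'],
      'status' => 1,
    ]);
    $account->save();
    // Send the standard "create your password" account e-mail.
    _user_mail_notify('register_no_approval_required', $account);
  }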

As soon as an existing team captain is removed from CiviCRM, the user account is cancelled and the team captain receives an email indicating that his account has been disabled.

We have also created a Drupal module to store the team ID on the Drupal user record and use this team ID in the views (see https://github.com/CiviCooP/roparun_team_portal).

So the first bit is done, meaning team captains can log in. The next bit is to build the portal with Drupal Views and Webforms. The building blocks we are going to use for that are CiviMRF Webform, CiviMRF Views and, on the CiviCRM side, the form processor. I will keep you posted on the developments of the next steps.

Tags: API, Architecture, Case studies and user stories, Drupal, Extensions, Interface and design, Tips

Sooper Drupal Themes: Drupal 8 Adding Content And Content Types | 8 Days To Drupal 8 | Day 5

1 month ago

We're counting down the days to the official SooperThemes Drupal 8 Release! Count with us as we will be writing a Drupal 8 related blog post every day for the next 8 days.

Drupal 8 content and content types video tutorial

view on sooperthemes.com if you can't see the video

This tutorial is for people who are new to Drupal 8. We'll be showing how to add content to Drupal 8 and how to change or add new content types. Content types are very flexible in Drupal 8 and that's what makes Drupal more powerful than WordPress and other systems for many use cases. 

Drupal 8's Content Overview Page

Just like in all of Drupal's previous versions, this administration page is the central hub where all your content appears in one place. You get there by clicking the Content link in the toolbar. This takes you to an overview of all the pages in your Drupal website. From here you can edit and delete pages using the action links on the far right side of the table. Additionally, you can operate on multiple pages at a time using the controls below the Action label.

The content administration page is not just for pages: using the primary tabs you can navigate to administration pages for files and comments. This can be extended by additional modules that provide custom content entities using Drupal 8's Entity API. For example, on sooperthemes.com we also manage domain licenses and digital downloads on separate tabs, because these are custom entity types.

Adding Content In Drupal 8

Once you're at the content administration page, it's easy to see how to add content. You start by clicking the blue button that reads "+ Add content". You'll now see a listing of the content types that are available in your Drupal installation, and you have to tell Drupal which type of content you would like to create. After choosing a content type you'll be taken to the content form, where you fill in all the form fields that make up your content type. You'll learn more about these fields in the next section, where we discuss adding fields to content types and adding new content types.

Drupal 8 Content Management Administration Page

Adding Content Types In Drupal 8

The ability to create new content types and choose from a large selection of different field types is what makes Drupal the system of choice for many organisations that manage a lot of content. Companies of all sizes including Tesla, Disney, United Nations, and Qualcomm use Drupal because it's the best solution for managing a large amount of content on the internet.

To manage content types in Drupal 8 click Structure in the toolbar and then click Content Types.  Now you're looking at a listing of content types installed on your website. If you just installed a new Drupal 8 website with the default profile you will see the Article and Basic Page items.  If you installed one of our theme demo profiles or the free Glazed CMS installation profile you'll have a bunch more options. Check out our YouTube video above for a quick tour. 

Once you're at the content types overview, click the "+ Add content type" button to create a new content type. The minimum you can do here is give your content type a name, for example "Special Page" or "Forum Topic". There are a number of other options for your consideration when creating a content type:

  • Description: Administrative help text for the content type.
  • Preview before submitting: You can require or offer a preview of the content before submission.
  • Explanation or submission guidelines: Additional help text for content editors.
  • Publishing options: You can choose whether this content should immediately be published upon saving. More importantly, you should also check the "Create new revision" checkbox to ensure that you can compare and revert to older versions of the content in case something goes wrong when editing it. The other two options, "Sticky at top of lists" and "Promoted to the front page", are legacy options that reference different ways in which your content can be prioritized in content listings (called views in Drupal).
  • Display settings: Choose whether to display the author and post date on this content type, which really only makes sense for blog posts or other social content. This should be disabled for most use cases.
  • Menu settings: Selecting menus here enables content editors to add the content items to menus on your website.

Creating A Jobs Content Type

Let's create an example content type for the new Careers section of our website. We'll need to post job vacancies, so let's call our new content type Jobs. To create a new content type, go to Structure > Content Types and click the "+ Add content type" button.

Here we fill in the name of our content type and disable the options for author information and menu structure. After all, we don't want all our job postings to clutter the menu system; instead, you would use the Views module to create an overview of available jobs.

Next, click the "Save and manage fields" button. Now we're on the content type configuration page, and this is where we add the fields that we need for our job vacancies. To see exactly how this works, check out the video above!
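
For readers who prefer code over clicking, the same Jobs content type and an example field could also be created programmatically. This is a hedged sketch using core APIs, with machine names chosen for illustration:

  use Drupal\node\Entity\NodeType;
  use Drupal\field\Entity\FieldStorageConfig;
  use Drupal\field\Entity\FieldConfig;

  // Create the "Jobs" content type.
  NodeType::create([
    'type' => 'job',
    'name' => 'Jobs',
    'description' => 'Job vacancies for the Careers section.',
  ])->save();

  // Add a plain-text "Salary" field to it.
  FieldStorageConfig::create([
    'field_name' => 'field_salary',
    'entity_type' => 'node',
    'type' => 'string',
  ])->save();

  FieldConfig::create([
    'field_name' => 'field_salary',
    'entity_type' => 'node',
    'bundle' => 'job',
    'label' => 'Salary',
  ])->save();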

Building A Content Type For Job Vacancies

Adding Visual Drag And Drop To Your Content Type

Finally, we'll show you how to get even more control over the design of your content with Glazed Builder, our visual drag-and-drop builder for Drupal 8 and 7. You can use Glazed Builder on any long-text field in Drupal, on any type of entity. To enable Glazed Builder on your content type, go to Structure > Content Types > Your Content Type > Manage Display. Here you can select one or more of your text fields (for example, the body field) and switch the Format option from Default to Glazed Builder. Take a peek at the end of our YouTube video to see the end result!

Tandem's Drupal Blog: Handling Post Migration Events in Drupal 8

1 month ago
April 20, 2018. Sometimes we need to alter data after a Drupal 8 migration has finished. With the migration events system, you can easily accomplish this. Why we need to do this: one of the university clients we are helping migrate their Drupal 7 site to Drupal 8 had an interesting dilemma. They use the Flag module to mark favorite content withi...

OhTheHugeManatee: Drupal Does Face Recognition: Introducing Image Auto Tag Module

1 month ago

Last week I wrote a Drupal module that uses face recognition to automatically tag images with the people in them. You can find it on Github, of course. With this module, you can add an image to a node, and automatically populate an entity_reference field with the names of the people in the image. This isn’t such a big deal for individual nodes of course; it’s really interesting for bulk use cases, like Digital Asset Management systems.

I had a great time at Drupalcon Nashville, reconnecting with friends, mentors, and colleagues as always. But this time I had some fresh perspective. After 3 months working with Microsoft’s (badass) CSE unit – building cutting edge proofs-of-concept for some of their biggest customers – the contrast was powerful. The Drupal core development team are famously obsessive about code quality and about optimizing the experience for developers and users. The velocity in the platform is truly amazing. But we’re missing out on a lot of the recent stuff that large organizations are building in their more custom applications. You may have noticed the same: all the cool kids are posting about Machine Learning, sentiment analysis, and computer vision. We don’t see any of that at Drupalcon.

There’s no reason to miss out on this stuff, though. Services like Azure are making it extremely easy to do all of these things, layering simple HTTP-based APIs on top of the complexity. As far as I can tell, the biggest obstacle is that there aren’t well defined standards for how to interact with these kinds of services, so it’s hard to make a generic module for them. This isn’t like the Lucene/Solr/ElasticSearch world, where one set of syntax – indeed, one model of how to think of content and communicate with a search-specialized service – has come to dominate. Great modules like search_api depend on these conceptual similarities between backends, and they just don’t exist yet for cognitive services.

So I set out to try and explore those problems in a Drupal module.

Image Auto Tag is my first experiment. It works, and I encourage you to play around with it, but please don’t even think of using it in production yet. It’s a starting point for how we might build an analog to the great search_api framework, for cognitive services rather than search.

I built it on Azure’s Cognitive Services Face API to start. Since the service is free for up to 5000 requests per month, this seemed like a place that most Drupalists would feel comfortable playing. Next up I’ll abstract the Azure portion of it into a plugin system, and try to define a common interface that makes sense whether it’s referring to Azure cognitive services, or a self-hosted, open source system like OpenFace. That’s the actual “hard work”.
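
For the curious, the service side boils down to plain HTTP calls. Here is a hedged sketch of a face-detection request from Drupal; the endpoint path, header name and request body reflect the Face API as documented around 2018 and should be treated as assumptions rather than the module's actual code:

  $endpoint = 'https://westeurope.api.cognitive.microsoft.com/face/v1.0/detect';
  $response = \Drupal::httpClient()->post($endpoint, [
    'headers' => [
      'Ocp-Apim-Subscription-Key' => 'YOUR-KEY-HERE',
      'Content-Type' => 'application/json',
    ],
    'json' => [
      // Publicly accessible URL of the image to analyse.
      'url' => 'https://example.com/sites/default/files/photo.jpg',
    ],
  ]);
  // Each detected face comes back with an ID that can later be passed to the
  // identification call that maps faces to known, trained people.
  $faces = json_decode((string) $response->getBody(), TRUE);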

In the meantime, I’ll continue to make this more robust with more tests, an easier UI, asynchronous operations, and so on. At a minimum it’ll become a solid “Azure Face Detection” module for Drupal, but I would love to make it more generally useful than that.

Comments, Issues, and helpful PRs are welcome.

Acro Media: One Entry Point - Commerce for Online and Real World Transactions

1 month ago

DrupalCon Nashville 2018 Session

Join Acro Media's technical Drupal Commerce veteran, Josh Miller (all things programming), and Business Developer, Becky Parisotto (all things business), as they walk through the wild world of physical commerce that is powered by and paired with a Drupal web interface. Both Josh and Becky work together with a number of physical commerce clients. Through our clients' requirements, we have gained a better understanding of the iceberg that is building an interface for retail, and allowing for a true omni-channel experience for both the customer and (sometimes more importantly) the business owner.

Josh will review the state of Point of Sale as it integrates with Drupal Commerce 2 on Drupal 8, compare and contrast fulfillment in the new shipping and inventory modules, and talk about a new module that handles requesting products from your suppliers and updates store stock when it's received. Additionally, Becky will walk us through what Drupal Commerce is capable of in the way of "powering your business" and truly being the end-to-end back-end brain for finances, accounting, product management, customer management, shipping, fulfillment, stock, inventory and community. Drupal Commerce is a big box of Legos; come and learn how we build fully integrated businesses, from the web to the storefront to the back of house, to the warehouse, and more.

This is meant to be a practical review with easy to digest client examples and micro case studies of how we merge an online tool with a physical store. Setting clients in digital stone, all powered by Drupal.

Talk to us

Acro Media is a Drupal Commerce development agency that specializes in enterprise-level ecommerce. We are committed to building strong strategic partnerships and using our ecommerce expertise to help clients create a dynamic web presence that engages audiences, generates revenue, and boosts brand awareness.
