Drupal Planet

Drupal.org - aggregated feeds in category Planet Drupal

Palantir: The Lowdown on DrupalCon Baltimore

May 9, 2017 - 6:33pm
Alex Brandt, May 10, 2017

Our favorite parts of DrupalCon Baltimore.

In this post we will cover...
  • Our favorite events from DrupalCon

  • Links to Palantir sessions


DrupalCon is always a positive experience for the Palantir team, largely because of the Drupal community itself. Our week in Baltimore was filled with engaged conversations, thoughtful sessions, and much appreciated down time with friends we don’t get to see often enough.

DrupalCon by the Numbers
  • Palantiri in attendance: 14
  • Palantiri sessions: 3
  • Client meetings: 7
  • Coffees consumed: at least 2 dozen
  • Newsletter sign-ups: 240
  • Podcasts recorded: 2
  • Late nights: 2 many

“It was a wonderful first DrupalCon experience because of a great community that is so supportive and accepting of newcomers, regardless of their level of Drupal knowledge.” - Annie Schow

Highlights by Day

Monday: We ate all the crabs.
Following the opening reception in the exhibit hall, we ate dinner as a team at Riptide by the Bay in historic Fells Point. An impressive number of crabs was consumed.

Tuesday: #PMTheMusical!
We witnessed another standing ovation for Joe Allen-Black and Allison Manley and their performance of Project Management: The Musical! There were quite a few crowd favorites, and Joe and Allison were both happy to share their final performance of this presentation in front of an energetic DrupalCon crowd.

Wednesday: Inclusion Initiative
We partnered with another Chicago-based agency, Digital Bridge, to coordinate a Drupal training session for five students local to Baltimore who were unfamiliar with Drupal. We’re looking forward to expanding the program in the future. Keep an eye out for more details on that later this month!

Thursday: #ContentBeforeCode, #DevTeamCollab and Trivia Night
If you missed them at MidCamp, Megh Plunkett, Michelle Jackson, and Bec White did round two of their sessions on Thursday (recordings linked above). Michelle and Bec’s session will also be available via a webinar later this summer, so stay tuned for your chance to sign up in case you missed it at DrupalCon.

Palantir also hosted Trivia Night at Baltimore Soundstage. We’re not sure the wait staff knew what was happening as over 400 people were tasked with answering some fairly obtuse and nerdy questions about this mysterious Drupal thing, but they kept everyone hydrated so we could enjoy the fun. Jeff Eaton killed yet again as the emcee for the evening. 

Friday: Exploring Baltimore
As people shuffled to the airport, a few Palantiri were able to squeeze in a last minute trip to the Baltimore National Aquarium. Thankfully not one Palantir was lost to sharks.

Thanks for a great week Baltimore. We’ll see you next time, DrupalCon!


Acquia Developer Center Blog: Managing Drupal Sites with Composer

May 9, 2017 - 3:56pm

Talking through the growing pains of using Drupal with Composer dependency management at DrupalCon Baltimore. Drupal gets better when companies, organizations, and individuals build or fix something they need and then share it with the rest of us. Open source technologies become better, stronger, and smarter when others take it upon themselves to make a positive difference by contributing their knowledge, time, and energy to them. Acquia is proud to play a part, alongside thousands of others, in making tomorrow’s Drupal better than today’s. One of the people making a difference is Jeff Geerling.

Tags: acquia, drupal planet, Composer, dependency management, drupalcon

Ben's SEO Blog: How to Improve Drupal 8 Website Performance

May 9, 2017 - 1:30pm

Getting faster page load speeds isn’t just about increasing your Google rankings. It’s about improving customer satisfaction and gaining new revenue. Using tools specific to Drupal along with other universal actions, you can reach your marketing goals faster than ever.

It’s no secret that page loading speed matters to Google rankings. Speed became a ranking factor in 2010, and since that time developers and marketers have been looking for ways to increase it. Google cares about page speed because the search engine is focused on delivering the best onsite user experience possible. As a result, Google rewards fast-loading websites with better rankings. (The converse is not always true: slow page load times will only hurt your rankings if they are very slow.)

As a marketer, your goal really isn’t better Google rankings. You are looking for the result of those rankings—more website visitors, more leads, and more revenue. Likewise, fast page load times aren’t the goal either, but a means to improve your users’ experiences. Better website interactions can result in greater satisfaction, more conversions, and higher sales.

Faster Page Load Time Results in Greater Revenue

Faster page navigation means that users may see more page views each time they visit your site. Having a fast website means that users can quickly understand your offering and purchase your products.

Studies show that faster page speed results in greater revenue.

  • Both Amazon and Walmart, in separate studies, attributed additional sales revenue to faster page speeds. Their revenue grew by 1% for every 100ms of page speed improvement. For Amazon, slowing down their page load time by just one second could result in a loss in revenue of $1.6 billion. That’s a lot of zeroes for a measly second.
  • Shopzilla increased revenue by 12% and page views by 25% by speeding up their page load time to 1.2 seconds from 6 seconds.
Customer Satisfaction Increases with Faster Page Speed

Faster websites mean happier customers. Particularly, studies have shown that:

  • A one-second delay in page-load time leads to a drop in pageviews (11%), conversions (7%), and customer satisfaction (16%), according to the Aberdeen Group.
  • Econsultancy research found that 47% of consumers expect to wait no longer than two seconds for a web page to load. Additionally, 88% of people who experience a dissatisfying visit due to page load times are less likely to shop from that site and more than a third will tell their friends about the bad experience.
  • According to KISSmetrics, 18% of mobile users will abandon a website if it doesn’t load in less than five seconds. If it takes more than 10 seconds to load, 30% will abandon the site.
Is Your Website Fast Enough?

The evidence shows page speed matters. Is your website fast enough? At a minimum you should aim for under 2 seconds. For e-commerce sites, you should have even faster goals.  Google’s goal is 100ms—faster than the blink of an eye.

It’s quite simple to test your website speed.  You can use Google’s PageSpeed Insights tool and WebPageTest.org to take a benchmark of how your website performs. If your pages load in more than two seconds or if you haven’t met your page loading goals, you should consider taking some of the steps below.

Ways to Increase Your Drupal Website Performance

There are general steps that every website manager can take to speed up page loading, but there are also specific Drupal tools and modules to know and use. I’ll address both.

1. Keep it simple.

Page speed starts with choosing a design that is clean and fast. By reducing the number of components on your page, and keeping widgets and embedded media to a minimum, you are on the way toward a lightning fast website.

2. Cache your pages.

Drupal 8 enables caching by default for anonymous visitors. That is sufficient for small to medium sized websites with moderate traffic. You can select the maximum age for your page caching based on how quickly your website content changes.

  • 1 day - good for websites that are only updated a couple of times per week and have no commenting or other interaction on the site (e.g. a lead-generation brochure site).
  • 1 hour - good for websites that are updated once or twice per day (e.g. ecommerce).
  • 15 minutes - good for frequently updated websites (e.g. news sites).

If you use Drupal 8 Views or Panels, you can get more fine-grained in your cache settings. The caching on each individual block can be customized as well.
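
For reference, the maximum age for the page cache lives in the system.performance configuration object in Drupal 8 core. You would normally set it on the Performance admin page, but here is a minimal sketch of doing the same thing from code (the one-hour value is just an example):

// Set the page cache maximum age (in seconds) via the configuration API.
// 3600 = 1 hour; pick a value that matches how often your content changes.
\Drupal::configFactory()
  ->getEditable('system.performance')
  ->set('cache.page.max_age', 3600)
  ->save();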

3. Optimize your website to work with different devices and browsers.

You can no longer only optimize speed for desktops. With 60% of online traffic coming from mobile devices, a mobile responsive website is critical. All of the things that you do to speed up your website will help, of course. There are also things that you can do specifically to make your website more responsive for mobile devices. You should make sure that your website is optimized to work with popular mobile browsers. One of the most powerful things you can do is to implement the Google AMP module. We talk in detail about it in our article, How Marketers Use Drupal's AMP Module to Improve Google Search Rankings.

4. Compress your images, CSS and JavaScript files.

The Advanced CSS/JS Aggregation module aggregates and compresses CSS and JavaScript files to make your site run faster. Google loves fast websites, and this module speeds things up with little overhead.

Drupal 8 Core has the ability to resize images and serve the right image for any situation. It can scale them, crop them, and much more. Consistent image sizes help reduce the bandwidth required to load a particular web page. This can greatly reduce load time.
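
As a small illustration of that image handling, a render array in a custom module or preprocess function can ask core to serve a resized derivative instead of the original file. This is a hedged sketch: the 'medium' style ships with core, and the file URI below is only a placeholder.

$build['photo'] = [
  '#theme' => 'image_style',
  // A core or custom image style; a resized derivative is generated and served.
  '#style_name' => 'medium',
  // Placeholder URI; use the URI of your own managed file.
  '#uri' => 'public://example-photo.jpg',
];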

5. Use a Content Distribution Network (CDN) with Drupal 8

There are a few third party tools that you can use with Drupal to speed up your website. A CDN stores your website on servers across the globe. CDN companies own data centers on every continent and in every region. Think of it as taking your server cache and making copies of it to servers that are a lot closer to your visitors. If the HTML does not need to be recreated by Drupal, then it is served directly from the CDN, greatly reducing the load times involved.

Example companies include (my personal favorite) CloudFlare, Level3, Amazon, and Akamai. The CDN module for Drupal is located here. There are also service-specific modules for CloudFlare and Akamai.

6. Choose a host that can offer greater speed

If, after implementing some of the tips above, you are still not meeting your performance goals, you should consider choosing a faster host. When you look for hosting, you’ll find many options. At Volacci, we have experience with several dozen hosting companies that promise Drupal support and high speeds. In order of most capable to least capable (with considerable overlap in performance and cost), here is a list of the types of hosting you may want to consider:

Managed Dedicated Server(s)

A managed dedicated server takes care of all your hosting needs for you. Not only do you get the hardware but you get a team of experts to make sure everything is running as it should. They will keep your software up-to-date and alert you if there are any problems – often after they’ve already been fixed. You can deploy multiple servers in many configurations. For example, a firewall, caching server, database server or multiple http servers could all be part of a larger solution. It’s fast and reliable hosting. Adding multiple servers or getting help designing the perfect configuration for you is part of the service. Blackmesh (my personal favorite) is the Drupal-specific company for this kind of hosting. Also consider Rackspace.

Dedicated Server(s)

A dedicated server provides low latency which means a fast response time for most small to medium-sized sites. Consider that you need to provide technical staff to manage the hardware and software stack. Examples include HostGator and Viawest.

Cloud Hosting

Cloud hosting is scalable. The “cloud” means that there is a data center with lots of dormant servers. As your site’s needs scale up (or down) the servers in the data center respond with more server power. While it may be a panacea for some, latency and cost are critical concerns. Examples include Acquia Cloud, Pantheon, and Platform.sh.

Virtual Private Server (VPS) / Server Slice

A VPS offers a good balance between cost and performance. It’s similar to shared hosting in that you share a single server with other tenants. However, you get a guaranteed amount of performance on that server, perhaps 10% (or more) dedicated to you, which preserves your performance. Examples include HotDrupal and Green VPS.

Shared Hosting

Shared hosting is the rookie league of hosting. Your site sits on a server with many other tenants. It’s slow and not scalable but it’s inexpensive. Examples include Bluehost and SiteGround.

Learn More with Drupal 8 SEO

If you would like specific details on how to speed up your Drupal 8 website and optimize it for higher Google rankings, take a look at Drupal 8 SEO. This book is the definitive authority on SEO for Drupal 8 websites. Check it out.

Contact Volacci if you would like our Drupal SEO experts to create a plan and implement best practices that will maximize your website performance and improve Drupal SEO.

Related: Surprising Marketing Benefits of Increased Web Page Loading Speed
Tags: drupal 8, drupal websites, Planet Drupal

Aten Design Group: Testing for the Brave and True: Part One

May 9, 2017 - 12:16pm

This is the second part of a series of blog posts about automated testing for Drupal. Its mission is to take you from zero testing experience to confidence in testing your custom Drupal work, from the ground up. Last time, in Testing for the Brave and True: Part Zero, we defined exactly what automated testing is and discussed some of the common vocabulary of testing. That post also introduced the two primary tools used by the Drupal community to test their work: PHPUnit and Behat.

Why Automated Testing Will Save You Time and Treasure

Now that we know a little about what automated testing is, I'm going to make the case that it can be a net positive to your everyday workflow and actually make you a better programmer.

Everybody Tests

If you came to this blog post with the idea that you've never done any testing or that you've tried to test and didn't succeed, you'd be wrong. Every developer tests their code. Some developers just throw that work away.

Consider what you're doing every time you clear cache and go refresh your browser. You're testing your work. You've made some change to your code and now you're asserting that your work functions as you expect. Perhaps you put a dpm() or kint() in your new code to inspect some part of your code or a variable, or maybe you're using XDebug (if not, I'd encourage you to start) to step through your code. This process is testing.

While these informal tests can be incredibly valuable, you can't commit them; you can't run them the next day, and you cannot run all the tests you've ever written with just one command. Writing automated tests is simply writing code that can do some of that testing for you. It's making the informal tests you already do explicit and formalized.
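
To make that concrete, here is a minimal sketch of what such a formalized test looks like with PHPUnit in Drupal 8. The module name (mymodule) and the assertion are placeholders; real unit tests live under your module's tests/src/Unit directory and run with phpunit or your CI tool of choice.

<?php

namespace Drupal\Tests\mymodule\Unit;

use Drupal\Tests\UnitTestCase;

/**
 * Formalizes the kind of check you would otherwise do by hand in the browser.
 *
 * @group mymodule
 */
class ExampleTest extends UnitTestCase {

  /**
   * Asserts that our (placeholder) calculation behaves as expected.
   */
  public function testSomethingWeUsedToCheckManually() {
    $result = 2 + 2;
    $this->assertEquals(4, $result);
  }

}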

Context, context, context

Whenever you write code to do specific things, you make assumptions. Assumptions are the foundation of abstraction and abstraction is the foundation of progress. You can't write code that does anything useful without making assumptions. Especially in Drupal. Entities themselves are an abstraction writ large. But, wrong or hidden assumptions are also the root of most bugs.

Therefore, when we write code, we ought to be very aware of the assumptions we make. We ought to record those assumptions in some way, for future maintainers or simply to help us remember that we made them in the first place. Unfortunately, when we only do informal testing, we bake our wrong assumptions into our code without leaving a record of them. We can't re-assert our assumptions later without digging through code or comments or taking the time to figure out what something was actually supposed to do.

This is the first place where formal tests can be a boon to you, future you, and your successors. The act of writing formal, automated tests by its very nature is recording your assumptions for posterity. When you return to your code an hour, day, week, or year later, all the assumptions you made can be tested again. If you have a strange if/else in your code because of some edge case you discovered when you were doing your initial development, a test can ensure that that code isn't deleted when you're cleaning up or refactoring later (at least without explicitly deleting the test).

In short, you make your assumptions explicit. This reduces the cognitive burden of "getting back up to speed" whenever you need to come back to some piece of code.

Confidence

This is where I first really fell in love with testing. Having formal tests for the code I was working with gave me confidence as I made changes. That can sound strange to someone who's never tested before. It sounded strange to me, too.

The confidence I'm talking about is not the confidence I have in my abilities (Lord knows I could learn a little more about being humble), it's my confidence in the codebase itself and the relative safety I have when I incorporate a change.

If you've ever been in an old, large, legacy codebase, you might recognize that feeling of mild anxiety when you've made a change and there's just no feasible way to know if you've broken something else in some obscure place of "the beast". The only thing you can do is click around and cross your fingers. This is where a well-tested codebase can create real confidence. Having a suite of automated tests means I can isolate my changes and then run all the tests ever written for that codebase and ensure that my changes haven't broken something, somewhere.

Better tests, better code

If you've been interested in the art of programming itself (and I think you must be to be reading this), then you might have heard of the SOLID design principles. Or, at least, things like "write small functions" and "do one thing and one thing well." Maybe you've heard about "dependency injection," "separation of concerns," or "encapsulation." All these words are names for the concepts that, when applied to the way we write code, make the likelihood of our code being robust, flexible, extensible, and maintainable (all good things, right?) go up.

The art and practice of testing itself can help you apply all of these concepts to your code. If you recall the term "unit testing" from the last post in this series, I said, "[unit] tests isolate very small bits of functionality." The process of identifying the one small thing that your code achieves in order to test it, helps you apply the Single Responsibility Principle. Put another way, when your tests become large and unwieldy, they're saying to you, "this code does too much and it should be refactored."

When you're testing code that has dependencies on other code or configuration, like access to the database, another service, or some credentials, it can become difficult to write useful tests. For example, if you're writing code that runs an entity query and you'd like to test how the code works when there are no results, five results or 500 results, you would have a hard time doing so with a real entity query and database connection. This is where "inversion of dependencies" or "dependency injection" come into play. Instead of running an entity query and doing processing on the results all in one function or within a single class, pass the entity query or its results into the function, method or class. This allows you to test the function with fake results, which you can then set up in your test (we'll go over the methods for doing exactly that in a later part of this series).
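
A rough sketch of that idea (the class and method names here are invented for illustration, not from this series): the code receives the query results instead of running the query itself, so a test can hand it zero, five, or five hundred fake results without touching the database.

/**
 * Builds a summary from entity query results that are passed in by the caller.
 */
class ResultSummarizer {

  /**
   * @param int[] $ids
   *   Entity IDs, e.g. the return value of an entity query run elsewhere.
   *
   * @return string
   *   A human-readable summary.
   */
  public function summarize(array $ids) {
    if (empty($ids)) {
      return 'No results.';
    }
    return count($ids) . ' results found.';
  }

}

// In a unit test, no database connection is required:
// $summarizer = new ResultSummarizer();
// $this->assertEquals('No results.', $summarizer->summarize([]));
// $this->assertEquals('5 results found.', $summarizer->summarize([1, 2, 3, 4, 5]));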

That inability to test code with implicit dependencies is a good thing™: it forces you to do dependency injection, whereas without tests dependency injection is simply a ritual you have to remember to practice (I should note, the reason inversion of dependencies is a good thing™ is because it makes your code modular and helps ensure it only "does one thing well").

What's next?

I hope I've made a convincing case that writing automated tests for Drupal can save you time and treasure. In the next part of this series, we're going to begin our descent into the art of testing itself. We'll go over writing our first unit test and getting it running on the command line. Until then, feel free to comment or tweet @gabesullice if you've got questions!


Code Positive: Atomic Design

May 9, 2017 - 9:00am

The main idea behind Atomic Design is to think about components in terms of their smallest, simplest elements (such as a menu item or a search button) first and to build up from there - to design from the element upwards rather than starting with page-level wireframes.


Agiledrop.com Blog: AGILEDROP: Case Studies on DrupalCon Baltimore

May 9, 2017 - 2:44am
There was an enormous number of sessions at the past DrupalCon. They are available online. But to make things easier for you, we'll simply group them together and add a little overview, so you can easily pick the ones you like. We'll start with case studies from DrupalCon Baltimore.

Building NBA.com on Drupal 8 by Tobby Hagler from Phase2 and Josh Mullikin from Turner

The session gives an overview of NBA.com, the reasons why Drupal 8 was chosen for the 2016-2017 season, and how Drupal 8 interacts with other systems and stack components. Attendees learned what worked, what they should…

Dries Buytaert: 7-Eleven using Drupal

May 8, 2017 - 10:23pm

7-Eleven is the largest convenience store chain in the world with 60,000 locations around the globe. That is more than any other retailer or food service provider. In conjunction with the release of its updated 7-Rewards program, 7-Eleven also relaunched its website and mobile application using Drupal 8! Check it out at https://www.7-eleven.com, and grab a Slurpee while you're at it!


Cocomore: Recap: Drupal Camp 2017 in Frankfurt

May 8, 2017 - 7:00pm

At this year's DrupalCamp in Frankfurt, we were not only excited participants; as part of the organization team, we were also responsible for the smooth running of the event, coordination on site, and the planning before the actual event took place. In this blog article, our colleague Ela summarizes the tasks we undertook and the topics that were on this weekend's Drupal agenda.


Promet Source: Drupal Training Course Announcement: Drupal 8 Developer Immersion, Learn Drupal 7 Confirmed for May, June

May 8, 2017 - 6:18pm
If your development team is ready to dive into Drupal 8 or Drupal 7 best practices, take note!  Promet Training has confirmed course dates coming up in May and June for our Drupal 8 Developer Immersion and Learn Drupal 7 courses, both in person and live online.

Mike Crittenden: Grav CMS for Drupal developers

May 8, 2017 - 6:13pm
Grav CMS for Drupal developers

If you've never heard of it, Grav is a pretty neat little flat-file CMS. If you're a Drupal developer, words like "flat-file" and "neat" and "little" are probably foreign to you. This post is an attempt to explain what Grav is, why it's neat, and how to use it, in terms that you'll understand.

First of all, where is the database?

As a Drupal developer, you live and die by the database. You've probably worked on sites that have had many hundreds of database tables. You might even remember the first time you realized that each field gets 2 database tables of its own.

The first thing you should understand about Grav is that there is no database. In place of it, there are 2 types of things:

- YAML files which hold configuration

- Markdown files which hold content

That's it. If you want to make a change to config, you change it in the relevant YAML file. If you want to update a page, you change it in the relevant Markdown file.

Oh, so it's a static site generator like Jekyll? No!

So far it may sound like a static site generator, but it's not. It's a CMS. This means that it can still do all the same types of things other CMSes can do that aren't available to static site generators.

For example, there's a really nice admin plugin that lets editors edit content via a UI, and upon saving, the content is instantly updated on the site (rather than the site needing to be re-built). Some static site generators have UIs, but they still require the intermediary site-generation step after making an edit.

You can also still have dynamic content listings, send emails, redirect users, integrate with web services, display user-facing forms, etc., since Grav is built with PHP and is super duper alterable via custom plugins. You'd need to handle that stuff client-side with a static site generator.

Content types in Drupal = Page Types in Grav

Let's start with the basics - the age old "content type." In Drupal, creating a content type happens in the UI.

In Grav, to create a content type, you just create a “whatever.html.twig” file in the templates/ directory of your theme. Doing that automatically tells Grav that “Whatever” should be a new Page type.

This means that when creating a page in the UI, you can choose the “Whatever” page type. Or, if you’re creating content via adding a Markdown file directly, just name the file whatever.md which tells Grav that it’s a “Whatever” type of page.

Read the docs on this.

Custom fields in Drupal = Blueprints in Grav

In Drupal, creating custom fields happens in the UI.

In Grav, to create custom fields for a given page type, you’ll do it in a YAML file. Grav calls this a “Blueprint”. Just create a file in /user/blueprints/pages/PAGETYPE.yaml and throw in something like this:

title: PAGETYPE
'@extends':
    type: default
    context: blueprints://pages

form:
    fields:
        tabs:
            fields:
                content:
                    fields:
                        header.heading:
                            type: text
                            label: Heading
                        header.subheading:
                            type: text
                            label: Subheading

Basically, that will add two new text fields (“Heading” and “Subheading”) to the “Content” tab of the form for that page type.

When you save that form, it’ll throw that data into a little YAML block at the top of the Markdown file that stores the content of that page. This is called Frontmatter or Headers and is actually really really cool because it means that the sky is basically the limit in terms of how to store structured data. You can store it in any way that YAML supports.

Then, in the Twig template (we’ll get to templates later), you can output the data for those custom fields using {{ header.heading }} or {{ header.subheading }}.

Read the docs on this.

Views in Drupal = Page Collections in Grav

In Drupal, creating a content listing happens (usually) in the Views UI.

In Grav, there’s the concept of a “Collection” which allows you to loop through and list arbitrary content. Here’s an example:

content:
    items: @self.children
    order:
        by: date
        dir: desc
    limit: 10
    pagination: true

And then in the Twig template, you’d just loop through them like so:

{% for p in page.collection %}
    {{ p.title }}
    {{ p.summary }}
{% endfor %}

Collections support lots of the same filtering/sorting/pagination concepts that Views supports. Some of the more complex stuff (such as fields from relationships or exposed filters) would have to be custom built via a plugin, but this should handle most of the things you’d typically use Views for pretty well.

Read the docs on this.

Taxonomy in Drupal = Taxonomy in Grav

Yep, it’s even named the same thing for you.

In Drupal, creating a Taxonomy happens in the blah blah blah you get the idea. All of this stuff is done in the UI in Drupal.

In Grav, creating a Taxonomy just means adding it to your site.yaml config file, like so:

taxonomies: [category,tag]

Just add it to that array and you’ve created a new taxonomy. Then, you can reference it from any given page like this, in the YAML Frontmatter:

title: Post title
taxonomy:
    tag: [animal, dog]
    category: pets

And that’s it. Taxonomies are MUCH simpler in Grav than in Drupal. They aren’t fieldable, for example (without some customization). They’re basically just a way to group content together, so that you can create listings (“Collections”) out of them.

Read the docs on this.

Configuration/CMI/Features in Drupal = YAML files in Grav

In Drupal, configuration is stored in the database. Drupal 8 provides core with the ability to sync this configuration with YAML in the filesystem, but the source of truth is the database.

This means that if you want to push some new configuration from site A to site B, you have to make the change in the UI, export it to YAML, move that YAML to the other site (via a git push or some other mechanism), and import it on the other site to make it live. People usually use Drush or Features to help with this process.

In Grav, the source of truth for configuration is the YAML itself, since there’s no database. To change configuration, just change the YAML file, and Grav will immediately recognize that. To move that change to another site, just git push/pull it and it’s live.

Read the docs on this.

Install profiles/distributions in Drupal = Skeletons in Grav

This is one area where Grav really shines.

In Drupal, shipping a distribution mostly involves doing work to make sure that a site has everything it needs in code and exported configuration, and installs correctly using the installer. This is a result of Drupal relying on a database, but not wanting to ship an exported copy of that database with the distribution.

In Grav, since there’s no database, a “distribution” (or a "Skeleton" in Grav-speak) is basically just a copy of the codebase. Grav has no notion of "installation" like Drupal's installer. Just copy the codebase to another web root somewhere and it’s ready to run. This means that it’s really easy to ship open source Skeletons, many of which are available here.

(It’s a tiny bit more nuanced than that since all you really need is the /user directory of the codebase which is where all the custom code is stored, but you get the idea).

Read the docs on this.

Drush in Drupal = CLI tools in Grav

Drush has saved the butt of many a Drupal developer. These days, Drupal Console is doing pretty well for itself too, but it’s the same basic idea. Talking to your site via the CLI is useful.

Grav has a couple built in CLI tools for many of the same purposes:

  • bin/grav: performs basic site tasks such as clearing cache, making backups, installing dependencies, or creating new projects
  • bin/plugin: performs commands provided by plugins (instead of Grav core), such as creating new users via the admin plugin
  • bin/gpm: (“Grav Package Manager”) - performs tasks you would expect of a package manager, such as listing, downloading, and updating plugins
Other random stuff

Here’s some other stuff that didn’t really deserve its own section. Feel free to read up on the docs on these if you’re curious.

Shortcomings and Downsides

There are a few things to keep in mind if you’re looking at using Grav for a project instead of Drupal.

One is that Grav doesn’t scale nearly as well. Many Drupal sites have many millions of nodes, thanks to the usage of a database. In general, I probably wouldn’t suggest using Grav once you start getting into the thousands with page count. Performance will likely start to suffer.

Drupal also really shines in creating complex content models, where there are many types of nodes/entities which reference each other or embed each other or reuse each other's fields, etc. Grav is perhaps more "page focused" than "data focused", which makes it much easier to work with for many sites, but not a great fit for some sites that need those complex relationships.

Grav also doesn’t really have the notion of an editorial workflow or moderation system. It does support published vs. unpublished, and there are things like Git Sync to auto-deploy from a staging environment (or your local site) to a production environment if you set it up to do so, but there’s no approval process along the lines of what Drupal and some modules can provide.

Also, Grav doesn’t have anything to match the Paragraphs module, which allows you to build content by placing arbitrary “slices” in an arbitrary order. It does have a “List” field type which allows you to add as many “field groups” as you want, but each group must have the same set of fields. So you can’t, for example, add a text slice, then a video slice, then an image slice, then another text slice, etc.

Obviously, Grav also isn’t going to have anywhere near the amount of 3rd party plugins (modules) that Drupal has. Things like integration with web services or commonly used libraries will have to be hooked up yourself, more often than not. That said, the API is solid and the documentation for it is legit.

That’s by no means an exhaustive list, but it's about all I’ve found so far. For your typical small to medium sized sites, Grav can be a really great solution that cuts out some of the overhead of a typical Drupal site. Recommended!


Drupal Modules: The One Percent: Drupal Modules: The One Percent — A Simple Timeline (video tutorial)

May 8, 2017 - 3:02pm
Episode 27

Here is where we bring awareness to Drupal modules running on less than 1% of reporting sites. Today we'll consider A Simple Timeline, a module which renders the results of a View in a vertical timeline.


Agaric Collective: Avoid sending emails while doing a migration on Drupal 8

May 8, 2017 - 1:34pm

During a migration, Drupal reads data from an external source and creates content in our new Drupal site, and while doing that Drupal executes any hook/event related to the new content. So hook_entity_insert is triggered for every new entity saved on our new site.

This can be a problem if our new site has features that are executed when new content is created (like sending a tweet or an email): when we run the migration we will get a ton of emails or tweets for the old content, and usually that is not the expected behavior.

Fortunately, in Drupal 8 migrations fire events, and we can create an EventSubscriber (more about EventSubscribers here) which will allow us to set a flag before the migration runs, so we can determine in our code whether the entity is being created in a migration or not.

The main idea was taken from this Moshe Weitzman gist (thanks!); I will add just the missing parts.

First, we generate all the event subscriber related files using this Drupal Console command:

drupal generate:event:subscriber

The console will ask some questions (in which module we want to generate the EventSubscriber and the name of the service):

Enter the module name [config_log]:
> your_module
Enter the service name [simple_faq.default]:
> migration_events.subscriber
Class name [DefaultSubscriber]:
> MigrationEvents
Enter event name [ ]:
>
Do you want to load services from the container (yes/no) [no]:
> no
Do you confirm generation? (yes/no) [yes]:
> yes

This will generate two files:

modules/custom/your_module/your_module.services.yml

which basically lets Drupal know that we have a subscriber there that needs to be executed.
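
As a hedged sketch (the exact output of the generator may differ slightly), this file typically contains little more than the service definition with the event_subscriber tag, using the service name we entered above:

services:
  migration_events.subscriber:
    class: Drupal\your_module\EventSubscriber\MigrationEvents
    tags:
      - { name: event_subscriber }

The second generated file is: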

modules/custom/your_module/src/EventSubscriber/MigrationEvents.php

With this content:

namespace Drupal\simple_faq\EventSubscriber;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\EventDispatcher\Event;

/**
 * Class MigrationEvents.
 *
 * @package Drupal\simple_faq
 */
class MigrationEvents implements EventSubscriberInterface {

  /**
   * Constructs a new MigrationEvents object.
   */
  public function __construct() {
  }

  /**
   * {@inheritdoc}
   */
  static function getSubscribedEvents() {
    return $events;
  }

}

In this file we need to add our flag, which will indicate to Drupal that we are running the migration. First, we need to import the Migrate events:

use Drupal\migrate\Event\MigrateImportEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Drupal\migrate\Event\MigrateEvents;

and after that add our methods:

protected $staticCache;

public function __construct() {
  $this->staticCache = &drupal_static("your_migration");
}

/**
 * {@inheritdoc}
 */
public static function getSubscribedEvents() {
  return [
    MigrateEvents::PRE_IMPORT => 'onMigratePreImport',
    MigrateEvents::POST_IMPORT => 'onMigratePostImport',
  ];
}

/**
 * @param \Drupal\migrate\Event\MigrateImportEvent $event
 *   Import Event.
 */
public function onMigratePostImport(MigrateImportEvent $event) {
  if ($event->getMigration()->getBaseId() == "your_migration") {
    $this->staticCache = FALSE;
  }
}

/**
 * @param \Drupal\migrate\Event\MigrateImportEvent $event
 *   Import Event.
 */
public function onMigratePreImport(MigrateImportEvent $event) {
  if ($event->getMigration()->getBaseId() == "your_migration") {
    $this->staticCache = TRUE;
  }
}

And that's it: now we have a flag which we can use to determine whether we are running the migration or not. The complete class looks like this:

namespace Drupal\your_module\EventSubscriber;

use Drupal\migrate\Event\MigrateImportEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Drupal\migrate\Event\MigrateEvents;

/**
 * Event subscriber to avoid sending emails/tweets/facebook posts on migrations.
 */
class MigrationEvents implements EventSubscriberInterface {

  /**
   * The drupal_static cache.
   *
   * @var array
   */
  protected $staticCache;

  /**
   * CommentEventSubscriber constructor.
   */
  public function __construct() {
    $this->staticCache = &drupal_static("your_migration");
  }

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    return [
      MigrateEvents::PRE_IMPORT => 'onMigratePreImport',
      MigrateEvents::POST_IMPORT => 'onMigratePostImport',
    ];
  }

  /**
   * @param \Drupal\migrate\Event\MigrateImportEvent $event
   *   Import Event.
   */
  public function onMigratePostImport(MigrateImportEvent $event) {
    if ($event->getMigration()->getBaseId() == "your_migration") {
      $this->staticCache = FALSE;
    }
  }

  /**
   * @param \Drupal\migrate\Event\MigrateImportEvent $event
   *   Import Event.
   */
  public function onMigratePreImport(MigrateImportEvent $event) {
    if ($event->getMigration()->getBaseId() == "your_migration") {
      $this->staticCache = TRUE;
    }
  }

}

And finally, we can now use this variable to determine whether we should send that email when creating a new entity, for instance:

/**
 * Implements hook_node_insert().
 */
function yourmodule_node_insert($entity) {
  // If the migration is running, just return without doing anything.
  if (drupal_static('your_migration', FALSE)) {
    return;
  }

  // All your code for sending emails/tweets here.
  // . . .
}

And that's it.

Here we used drupal_static to preserve the value through the execution of the migration. If you want to read more about it, check here.


Dale McGladdery: H5P - Portable Interactive Content in Drupal

May 8, 2017 - 1:24pm

H5P is an open source platform-independent authoring and display system for interactive content. Presentations, quizzes, and other interactive content can be created and displayed using building blocks known as H5P content types (different from Drupal content types). Once a piece of content is created it's easily exported to another H5P system. The development environment is open and well documented, allowing the creation of custom H5P content types.

H5P attributes include:

  • Available in Drupal 7, WordPress, and Moodle
  • Open Source
  • Content is exportable to any other H5P system
  • Uses JavaScript and HTML 5
  • Results tracking for content types such as quizzes
  • xAPI (Tin Can) integration
  • Drupal 7 hook system integration
  • Drupal development environment

Unfortunately there is no Drupal 8 version yet.

There are a variety of H5P content types, including containers such as accordions and sliders which can nest other content types. Some examples are:

  • Arithmetic Quiz
  • Course Presentation
  • Dialog Cards
  • Drag the Words
  • Fill in the Blanks
  • Timeline
  • Interactive Video

The complete list is at https://h5p.org/content-types-and-applications

H5P defines a file packaging format, the ".h5p" specification, or simply an H5P file. An H5P file is a zip archive bundling HTML, JSON, JavaScript, and media files. It can contain one or more of: a content type, a content export, an API implementation, an application, or a JavaScript library.

Drupal Integration

H5P is installed in Drupal in two steps.

  1. Drupal H5P module
    The H5P module is installed using the standard module installation process. It handles the Drupal integration.
  2. H5P Content Types and support files
    H5P content types and the files to support them are installed into a H5P Library manager provided by the H5P module. An H5P archive file of the content types and other support libraries is downloaded from the H5P site and uploaded into the Drupal H5P library.



Screenshot of H5P Content Library in Drupal

Content Creation

The Drupal integration contains an H5P node type named “Interactive content”. When an H5P node is created, there is a selector for the H5P content type - for example, quiz, presentation, or dialog card. When an H5P content type is selected, the editor for that content type is loaded interactively. The author then creates the desired content.



Screenshot of H5P content type selector

 



Screenshot of editor for H5P Flashcards content type

Once saved, the content is presented when the node is viewed.

H5P Development

The H5P project provides a Drupal development environment (including a developer mode), online documentation, and a forum.

The various specifications and basics for getting started are well documented, and include a “hello world” example. H5P at its heart is JavaScript with a PHP wrapper for integrating with a website. Someone's ability to learn the framework will depend on their comfort with JavaScript.

Coding the editor component that creates and edits the content type typically requires as much work as coding the display for the content type. Custom editor widgets can be written. Existing H5P editor widgets can also be used though they are not documented.

The H5P Drupal hooks provide a clean method of adding CSS stylesheets and modifying H5P behaviour without modifying the base H5P code. Some tasks are complicated by the asynchronous nature of JavaScript loading and the use of iFrames.

Pros and Cons

H5P is continuously changing and improving. These pros and cons are a snapshot of my experience as of May 2017.

Pros:

  • Plug and play interactive content
  • Easy to share content
  • Option for turning off content sharing feature
  • Large variety of content types
  • Open source
  • The H5P team is approachable
  • There is a good content development environment for Drupal
  • The content creators I worked with were able to quickly and easily generate content using H5P
  • Drupal hooks available

Cons:

  • Documentation on some content types is lacking
  • Trial and error is often required to figure out options for some of the sophisticated content types
  • Though some of the content type editors are excellent, some are obtuse or confusing
  • There is no Drupal 8 version
  • Content creators are endlessly creative; you will have to deal with content types being “close but not exactly what I want”
  • There isn't a lot of good guidance on developer workflow if you want to contribute back to the project
  • Staff focused on content creation will probably have favoured tools -- for example, Articulate Storyline -- and push back on an unknown tool such as H5P

Evolving Web: Drupal 8 Modules You Can't Live Without

May 8, 2017 - 10:29am

Drupal 8 does way more out-of-the-box than previous versions of Drupal. If you're migrating your site from Drupal 6 or Drupal 7, you'll be amazed how many contributed modules you can now do without. 

That being said, there are still a set of handy contrib modules you'll probably use for most of your projects. This isn't a complete list, just a starting point for anyone new to Drupal 8 looking for a useful set of modules to try. 

Admin Toolbar

The Admin Toolbar gives you a dropdown menu to access the sub-items in the toolbar quickly. This is probably the first module to add to your Drupal 8 site.

Pathauto

Pathauto is the go-to module for automatically generating nice aliases for all your URLs. You get to define the path patterns for any content on your site that has a path (nodes, users, taxonomy terms...). It works in multiple languages. You need to add Token and Chaos Tools as dependencies.

Redirect

Along with the Pathauto module, most websites benefit from using Redirect to take users to pages if and when the paths change. 

Paragraphs

The Paragraphs module is a favourite for site builders who want to be able to create flexible content types that use compound fields. Want to add a set of calls to action to a landing page? Or mix together some videos, marketing text, and linked images? Or perhaps you need to add a set of time-slots to an event, or a set of editions to a book? Paragraphs to the rescue. Suddenly adding chunks of content within your content is really easy. You need to add Entity Reference Revisions as a dependency.

Honeypot

If your website gets spam, Honeypot is an easy solution that might just fix your spamming issues. It inserts an invisible form element that catches bots, which unknowingly fill it in.

Add to Any

Add to Any is one of a number of options for adding social media links to your content.

Metatag

Not just for your basic page title and description. Metatag makes sure that your content is going to look good when you share it on Facebook and Twitter too.

Menu Trail by Path

Use Menu Trail by Path to set the active menu trail for your content based on the URL. For example, when you're looking at a blog post, and you want the blog post menu item to be active.

Entity Browser

One of the questions I get most often when I show off Drupal 8's shiny new content authoring features is how to re-use images or files across different pieces of content. Start with the Entity Browser module (entity is a fancy Drupal word for content; in this case it usually refers to images, videos, and files). You'll want to try this out with the File Entity Browser module. Configure it using the 'Manage Form Display' settings. (Hint: make sure you have all the required libraries installed to get this working.)

Block Visibility Groups

Block Visibility Groups allows you to control which blocks are displayed on certain types of pages. For example, you can create a set of blocks that will show up on the homepage, and a different set of blocks on the contact page.

Diff

Drupal allows you to track the revisions (or versions) of content each time a user makes an update. The Diff module is a tiny module that allows you to see what's changed. 

Contact storage

If you decide to use the core Contact module, you might notice that contact form submissions get emailed and not saved in the admin UI of the site. If that's something you need, try out the Contact Storage module. For fancier forms, check out Webform.

If you liked this blog post and want some guidance on how to use these modules, we have Drupal trainings coming up online and in-person that you might like.

This is the fun part. Now you get to comment and tell me the essential modules I missed. I promise to try them all and do a follow-up blog post with the highlights.


Web Omelette: Loading taxonomy terms in a tree in Drupal 8

May 8, 2017 - 3:59am

One of the great things about the taxonomy terms in Drupal has always been their hierarchical readiness. That is, how they can easily be organised in a parent-child relationship via a simple drag-and-drop interface. This feature becomes even more important in Drupal 8 where creating entities for anything you want has become easy so we no longer have to (ab)use taxonomy term entities for everything. Unless, of course, we need this kind of behaviour.

However, I recently noticed a shortcoming in the way we are able to load taxonomy terms programmatically. I needed to load a tree of terms as represented by their hierarchy. But there is no API (for the moment) that allows me to do so. At least none that I could find.

What I needed was similar to the menu system where if you load a tree, you get an array of MenuLinkTreeElement objects that wrap the links and which can each contain an array of MenuLinkTreeElement objects that represent the children (subtree) of that link. So one big multidimensional array of objects.

For terms, I imagined there would be something similar, but I was wrong. The Drupal\taxonomy\TermStorage::loadTree() method basically does half the job I want. It returns all the terms in the vocabulary, each represented as a stdClass object (why?) that contains some basic info. And in this object we get a parents array that contains the IDs of the terms which are its parents (as you know, terms can have multiple parents).

This may be enough in certain cases. However, I wanted to go one step further. So I created a simple service that loads the tree of a vocabulary in a multidimensional array, similar to what the menu system does:

<?php

namespace Drupal\taxonomy_tree;

use Drupal\Core\Entity\EntityTypeManager;

/**
 * Loads taxonomy terms in a tree.
 */
class TaxonomyTermTree {

  /**
   * @var \Drupal\Core\Entity\EntityTypeManager
   */
  protected $entityTypeManager;

  /**
   * TaxonomyTermTree constructor.
   *
   * @param \Drupal\Core\Entity\EntityTypeManager $entityTypeManager
   */
  public function __construct(EntityTypeManager $entityTypeManager) {
    $this->entityTypeManager = $entityTypeManager;
  }

  /**
   * Loads the tree of a vocabulary.
   *
   * @param string $vocabulary
   *   Machine name.
   *
   * @return array
   */
  public function load($vocabulary) {
    $terms = $this->entityTypeManager->getStorage('taxonomy_term')->loadTree($vocabulary);
    $tree = [];
    foreach ($terms as $tree_object) {
      $this->buildTree($tree, $tree_object, $vocabulary);
    }
    return $tree;
  }

  /**
   * Populates a tree array given a taxonomy term tree object.
   *
   * @param $tree
   * @param $object
   * @param $vocabulary
   */
  protected function buildTree(&$tree, $object, $vocabulary) {
    if ($object->depth != 0) {
      return;
    }
    $tree[$object->tid] = $object;
    $tree[$object->tid]->children = [];
    $object_children = &$tree[$object->tid]->children;

    $children = $this->entityTypeManager->getStorage('taxonomy_term')->loadChildren($object->tid);
    if (!$children) {
      return;
    }

    $child_tree_objects = $this->entityTypeManager->getStorage('taxonomy_term')->loadTree($vocabulary, $object->tid);

    foreach ($children as $child) {
      foreach ($child_tree_objects as $child_tree_object) {
        if ($child_tree_object->tid == $child->id()) {
          $this->buildTree($object_children, $child_tree_object, $vocabulary);
        }
      }
    }
  }

}

No need to copy it from here, you can find it in this repository inside a simple module called taxonomy_tree.

So what I basically do here is load the default tree and iterate through all the objects. If any of them are not already at the bottom of the tree, I populate a children key with their children. This happens by using the TermStorage::loadChildren() method and recursing through that list as well.
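
For completeness, a hedged usage sketch: assuming the class is registered as a service in the module's services.yml (the repository linked above does this; the service name below is an assumption, so check that file), loading and walking a vocabulary's tree looks roughly like this:

// Load the tree of the 'tags' vocabulary.
$tree = \Drupal::service('taxonomy_tree.taxonomy_term_tree')->load('tags');

// Each top-level element is the stdClass object returned by loadTree(),
// with an extra "children" array populated recursively.
foreach ($tree as $tid => $term) {
  $name = $term->name;
  $child_count = count($term->children);
}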

So let me know what you think and if this helps you out.


Promet Source: How to use SOAP to communicate with the ICC API

May 5, 2017 - 6:02pm
Promet Source recently worked with a national media provider to develop a website from the ground up that was both a highly optimized web property for converting prospective subscribers and a customer portal that would deliver a flawless user experience for existing customers, customer service representatives (CSRs) and retailers. In this post, we’ll walk through how the Promet Source development team used SOAP (Simple Object Access Protocol) to communicate with the ICC API built for this project.

Chocolate Lily: Authoritarian structure in Drupal: a case study

May 5, 2017 - 4:37pm

In the current context of a Drupal leadership crisis and debate about project governance, it's important to reflect on ways the dictatorship structure has shaped and continues to shape the culture of the project. In this vein, response to a 2007 post by Drupal contributor Gus Geraghty makes for a fascinating, if disturbing, case study.

I recommend reading (or rereading) that thread before continuing here. I've deliberately chosen an example from earlier days to emphasize how tensions in the project, and patterns of response, have persisted and shaped the project at key junctures. I also hope that some distance may help to set those events in a reflective light, where the focus is not on who did what but on what we can learn about the overall organizational culture.

Those who raise critical questions are making a valuable contribution. Particularly in an authoritarian structure, speaking up is risky.

In his post and followup comments, Geraghty directly questioned the dictatorship power structure of Drupal, focusing on the then-new commercial interests of Drupal founder and dictator for life, Dries Buytaert, and his company, Acquia. Geraghty proposed a concrete alternative: reorganizing the project along cooperative lines. In follow-up comments, he pointed to the Linux Foundation as a possible model, structured to ensure no one company could attain dominance in the software project:

it fosters the growth of Linux by focusing on protection, standardisation and providing a neutral forum for collaboration and promotion. It also sponsors the work of Linus Torvalds, as opposed to a commercial interest paying Linus.

The response was immediate, pointed, and overwhelming.


Drupal Association blog: The Process for Evolving Community Governance

May 5, 2017 - 3:59pm
Discover > Plan > Build > Iterate

There comes a time when we must all recognize that what got us here won't get us there. Now is that time for Drupal. The governance models that were put in place to support the needs of the community years ago are no longer working as well as they should. The Drupal community has reached a level of maturity that requires greater clarity, integrity, and resilience.

An effort is underway to evolve Drupal’s community governance. The Drupal community is in the driver’s seat. The Drupal Association is helping navigate and get the community where it wants to go by providing the structure, support, and resources that are desperately needed to make progress. I, Whitney Hess, have been engaged to be a neutral facilitator of this process.

We are proposing a multi-phase approach to redesign Drupal’s community governance models, management, and decision-making practices: Discover > Plan > Build > Iterate. In this first phase, our goal is to gain a deeper understanding of the needs of the Drupal community. We are conducting this research through a variety of methods: one-on-one interviews with select individuals; mediated group discussions; surveys and feedback forms.

We held seven hour-long Community Discussions over three days of DrupalCon. There were 6-10 participants per session. Though every session had its own energy and topics varied, all discussions were fruitful and impactful. Many participants said they left feeling better than when they arrived.

While there was some discussion about recent events in the sessions, the focus quickly shifted to brainstorming ideas for how to improve Drupal’s community governance. As mediator, it is my role to help people articulate their needs, and to support the community in devising strategies to better get those needs met. Please read the meeting summaries if you would like to get a sense of what was discussed.

There are currently seven online sessions scheduled over the next two weeks at a variety of times for the global community to participate in these facilitated discussions, and more will be scheduled if needed. If you want your voice heard, I strongly encourage you to join us. If you have questions or concerns about the sessions, you’re welcome to contact me directly at whitney@whitneyhess.com.

Once these sessions are completed, we will be conducting a short survey and other types of feedback forms to have the widest possible reach. We want to ensure that people have a variety of ways to constructively contribute to making Drupal the best it can be. We expect to launch these in late-May.

At the conclusion of the Discovery phase, we will move into Planning. We are at the earliest stages of conceiving a one- to two-day Governance Summit in June to take all of the learnings from Discovery and craft a strategy for how, specifically, to change Drupal’s community management and governance. As of today, we do not yet have dates, a location, or participant information. We are waiting to see what comes out of Discovery before we devise any framework for how this can be achieved effectively and equitably. Again, the Drupal Association’s role here is to be a support, and to create space for the community to decide how it wants its governance to change.

I have very clearly heard a need for greater transparency into this process and how decisions are being made. I take that responsibility seriously, and will continue to share our progress along the way. Next up, please look out for a summary of our Discovery findings, to be shared in late-May/early-June.

With gratitude,

Whitney

Categories: Blogs

Hook 42: Baltimore DrupalCon - Favorites From Charm City

May 5, 2017 - 3:50pm

Every year DrupalCon brings the community together. This year we were fortunate enough to have eleven of our team come together in Baltimore! We had a ton of fun while Drupaling and want to share a few of our favorite moments!

Categories: Blogs

Palantir: Competitive Analysis: The Key to a Woman’s Healthy Heart - Part 2

May 5, 2017 - 2:23pm
Competitive Analysis: The Key to a Woman’s Healthy Heart - Part 2 brandt Fri, 05/05/2017 - 12:23 Michelle Jackson May 5, 2017

Competitive user testing can validate the conclusions that are made from a preliminary competitive analysis.

In this post we will cover...
  • How health systems can conduct competitive usability testing

  • How navigation organization and prioritization affect the ability of people to find information on specific health topics such as heart disease and its impact on women’s health

  • How competitive analysis can help health systems improve information architecture to better serve people suffering from critical illnesses

  • How looking at peer competitors can help health systems better serve the needs of patients and their caregivers

We want to make your project a success.

Let's Chat.

Heart disease is the No. 1 cause of death for women in the United States. We are homing in on DrupalCon host city Baltimore, which has launched several initiatives to combat cardiovascular disease. Johns Hopkins Medicine and the University of Maryland Medical Center are two large university hospitals local to Baltimore that have centers dedicated to women and heart disease. Using women’s heart health as our focus, we compared select search outcomes, menu hierarchy, labeling, and landing pages.

Step 1 was a cursory competitive analysis of two health system websites that we covered in part 1. Step 2 is competitive user testing to validate the conclusions that we made from the preliminary competitive analysis.

Competitive user testing is a useful way to see how your site measures up against your competitors’ sites. By taking a look at how patients may interact with your site and competitor sites, you can compare their experience and make changes that allow you to better serve patients’ specific needs. You can implement competitive usability testing even if you have not completed a preliminary competitive analysis.

Since we last discussed websites and women’s heart health, we held two user tests to compare site visitors’ experiences when navigating the Hopkins Medicine and University of Maryland Medical Center (UMMC) health system websites.

We chose tasks based on actions that people might perform on a health system website. We also considered the type of information women look for when researching heart disease.

For our user tests, we asked two women who are Maryland residents between the ages of 40 and 70 to complete several tasks on both the Hopkins Medicine and University of Maryland Medical Center websites without using search.

Tasks we asked participants to perform:

  1. Learn if you are at risk for heart disease
  2. Share information about the risks of heart disease
  3. Find a physician
  4. Schedule an appointment
  5. Find directions
  6. Pay a bill
  7. Find a program or center related to women and heart health

Select Findings

We found:

  • “Health” and “Healthy Heart” labels provide users with a quick pathway to heart health risk information
  • Users successfully completed the majority of top tasks (e.g., schedule an appointment)
  • Finding programs and information about risks associated with women’s heart health is challenging
  • Multi-level navigations and redundant label terminology created complex pathways for users

The experiences of these two women revealed some challenges that might be experienced by other site visitors. Our findings warrant additional usability testing to further evaluate and compare how these and other health system websites help patients seeking information about programs and centers that address women’s heart health.

“I would never [pay my bill] this way, I would pay it online [through my bank].”

Highlights and Challenges

“Health” and “Healthy Heart” labels provide users with a quick pathway to heart health risk information.

On the Hopkins Medicine health system website, both the first and second participants successfully found "know your risks" on the Healthy Heart landing page (see Figure 2) when looking for information about heart disease risk. The second participant said Hopkins Medicine performed better than its UMMC counterpart in describing the factors that put people at risk for heart disease.

When navigating the UMMC website, the second participant reached the Women’s Heart Health Program landing page and said the risks described there were actually symptoms associated with heart disease, not risk factors. “They don’t say, blood pressure, overweight, sleep apnea,” she commented. The first participant did not locate heart disease risk information on the UMMC website.

Figure 1: Hopkins Medicine Healthy Heart navigation menu

Hopkins Medicine’s main and secondary navigation labels (Health > Healthy Heart) gave users quick access to information on heart disease risk.

Figure 2: Hopkins Medicine Healthy Heart landing page

“Know Your Risks,” featured prominently in the Healthy Heart local navigation, makes heart disease risk information easily accessible to users.

Figure 3: UMMC Women’s Heart Health Program landing page

UMMC’s Women’s Heart Health Program landing page does not make information about heart disease risks easily accessible.

Participants successfully completed the majority of top tasks (e.g., schedule an appointment).

Both users successfully completed the majority of top tasks such as find a physician, schedule an appointment, find directions, and pay a bill. The first participant did not find a way to locate a physician from the UMMC homepage or the Heart & Vascular Center landing page. This may have been because of the placement of the calls to action in the sidebar (Figure 5), adjacent competing content (Figure 5), and the similarity in color between the utility navigation and the "Find a Doctor" call-to-action button (Figure 4).

“If I wanted to find a physician I woul[d] ...call a heart specialist first.”

Figure 4: UMMC main navigation and “Find a Doctor” call to action button

Calls to action for top tasks such as "Make an appointment" and "Find a Doctor" blend in with the utility navigation colors, which could make it hard for users to see these key buttons.

Figure 5: UMMC Heart and Vascular Center landing page

Calls to action for top tasks such as "Make an appointment" and "Find a Doctor" compete for attention with the UMMC Cardiologists video and the hero news story "One Family, Two Heart Transplants."

Finding programs and information about risks associated with women’s heart health is challenging.

The first participant found neither Hopkins Medicine’s nor UMMC’s programs or centers related to women and heart health, even when visiting pages dedicated to cardiology or cardiovascular health. She visited UMMC’s programs listing page, which contained separate links to women’s health and heart and vascular health pages but did not list the Women’s Heart Health Program under either of these headers (see Figures 6 and 7).

After the first participant clicked through to the Heart and Vascular Center landing page, she scanned Services but did not find the Women’s Heart Health Program because the secondary navigation extends below the top half of the page, hiding the program from view. The first participant was also unsuccessful in finding a program or center related to women and heart health on the Hopkins Medicine health system website. "I can find heart stuff, I just can’t find anything on women," she said.

The second participant was able to find the Women’s Heart Health Program on the UMMC site; however, she remarked that it was challenging to locate: “It’s not intuitive how you would find a program here. Now I see it, but not before I’ve gone through too many exercises.” 

Figure 6: UMMC programs landing page

The Women’s Health section on the programs landing page has no mention of the Women’s Heart Health Program.

Figure 7: UMMC programs landing page

The Heart and Vascular Center section on the programs landing page has no mention of the Women’s Heart Health Program.

Figure 8: UMMC Heart & Vascular Center landing page

The local navigation for the Heart and Vascular Center landing page has items under “Services” that extend beyond the top of the page. One user stopped at Pulmonary Hypertension and missed the last item in the “Services” dropdown, “Women’s Heart Health.”

Multi-level navigations and redundant label terminology created complex pathways for users.

Participants had difficulty remembering how they got to specific pages because of redundant label terminology and deeply nested pages. When looking for information on heart disease risks, the first participant had trouble using the main, secondary, and local navigation to find it, selecting Health information > Medical encyclopedia > Look up a symptom > Medical Encyclopedia > Your Health, only to return to the main navigation and click on Centers and Services and then Patients and Visitors.

“I would always call because half the time it gets lost when you do it through the portal.”

Takeaways

We can gain a more nuanced understanding of how diverse patient demographics navigate and use health system websites by conducting competitive usability tests and focusing on a specialized medical issue such as women’s heart health.

While this study warrants additional research and usability testing given the small number of participants, it does reveal challenges that many of our clients in the health care industry face, such as navigation and findability.

To better serve patients and their caregivers, health system websites can take several steps to improve the experience for site visitors:

  • Simplify their menu structure so that there are fewer levels and sub-navigations
  • Remove competing content from areas of the page and improve color contrast, helping users access key buttons like “make an appointment” and “find a doctor”
  • Reference and cross link to women’s heart health centers and programs on program pages
  • Provide related content on pages that have resources on heart health and women’s health to improve findability
  • Revisit alphabetizing navigation items in favor of featuring top specialties and health areas that are of primary importance to patient audiences

In the health care field, meeting the needs of patients can be a matter of life and death. It is important for health systems to continually evaluate how well their websites meet the needs of patients and caregivers, how they can improve the experience for those audiences, and how they can best facilitate access to information about health services and resources.

We want to make your project a success.

Let's Chat.
Categories: Blogs
