Drupal Planet

Drupal.org - aggregated feeds in category Planet Drupal

Vardot: How to Rank #1 on Google Using Drupal CMS?

August 9, 2018 - 18:47
Ahmed Jarrar

SEO (Search Engine Optimization) is a hot topic in any discussion of online marketing. In today's Internet climate, most people find websites by typing a few keywords into their search engine of choice, like Google. People trust Google to present only the sites most relevant to their queries, so it's only natural that the first few results are the ones that get the most clicks and visitors.

As a consequence, those top rankings are viewed as prime real estate by anyone who wants to boost traffic to their website. After all, hardly anyone bothers checking the sixteenth results page when searching for a word or phrase. This is where SEO comes in.

 

What Is SEO? 

 

SEO is the practice of optimizing a website to rank high among search engines. It’s a set of rules for organizing, populating, and presenting your website in such a way that improves its search rankings and puts businesses in a better position to earn.

Not coincidentally, a well-optimized website not only makes search engines happy, it also makes visitors happy. Well-optimized websites are fast and easy to navigate, leaving visitors with a positive impression and making them want to stick around a little longer.

If you want to drive all that traffic to your website, it’s important to have the basics of SEO down pat, and have a grasp on what good and bad SEO practices are. It’s also important to have a good engine underneath the hood of your website, and Drupal just might be the CMS for you if that’s the case.

At its core, Drupal was built with SEO in mind. It has the power, flexibility, and tools needed to optimize every facet of your website for search engines, and its huge catalog of modules includes quite a few dedicated to making optimization easier. You really can't go wrong with Drupal if you care about your pages' search rankings. Here are a few things you can do with Drupal to improve your SEO and climb toward the top of Google's search results.

 

Implementing Metatags

 

Meta tags are bits of text that are integral to improving your website's search ranking because, in a way, they tell search engines what the content is on each page of your website. They range from the titles of your pages to the little descriptions you see underneath the links on a Google results page. Search engines need these bits of information to properly index your site and present it on the results page.

You can leave it to search engines to derive titles and descriptions from your pages, but with the Drupal Metatag module you can set the metadata yourself. Defining your own page titles and descriptions presents your site more accurately to search engines and the online world.
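The Metatag module manages all of this through configuration and tokens. For a sense of the underlying mechanism, Drupal 8 lets any module attach tags to the page head via hook_page_attachments(); the sketch below is a bare-bones, hypothetical example (the module name and description text are made up), not how Metatag itself is implemented.

<?php

/**
 * Implements hook_page_attachments().
 *
 * Attaches a meta description to every page. The Metatag module does this
 * per content type with token support; this is only an illustration.
 */
function mymodule_page_attachments(array &$attachments) {
  $attachments['#attached']['html_head'][] = [
    [
      '#tag' => 'meta',
      '#attributes' => [
        'name' => 'description',
        'content' => 'A hand-written description for search engines.',
      ],
    ],
    // A unique key identifying this head element.
    'mymodule_meta_description',
  ];
}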

 

Cleaning up Your URLs

 

Having messy-looking links is a no-no when it comes to SEO. You want URLs that are easy to read rather than a jumble of letters and numbers, so they look more attractive to prospective visitors and to search engines, which may scan your URL for keywords when determining your site's ranking.

Many web developers never realize the implications of messy URLs and leave their link syntax as-is, but going through each and every page on your website and manually setting the URLs isn’t an attractive option either. Luckily, Drupal generates clean URLs by default, improving the readability of your links and making things a bit easier on you.

If you want your links to be even easier on the eyes, the popular Pathauto module automatically generates clean, readable URL aliases from configurable patterns, which is perfect for your site's optimization.
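Pathauto builds those aliases from token patterns such as /blog/[node:title]. Purely for illustration, this is roughly what a single alias looks like at the API level, using core's alias storage service as it existed around Drupal 8.5 (the paths are made up):

<?php

// Give node 42 a readable alias instead of the default /node/42 path.
// Pathauto creates aliases like this automatically from its patterns.
\Drupal::service('path.alias_storage')->save(
  '/node/42',
  '/blog/how-to-rank-on-google-with-drupal'
);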

Another thing to keep in mind is making sure that your links actually go somewhere. Nothing sours the user experience more than clicking a link and being presented with a 404 page, and this in turn negatively affects your search rankings.

You can prevent this by using the Redirect module. If you change a page's URL after Google has indexed it, or move content to a different URL, this module lets you create 301 redirects from the old link to the new one quickly and painlessly, without the headache of hunting down and fixing broken links yourself.
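As a rough sketch of what the module enables programmatically (the paths are hypothetical, and the entity methods shown should be checked against the Redirect module version you run), a 301 redirect can be created in code like this:

<?php

use Drupal\redirect\Entity\Redirect;

// Permanently redirect an outdated path to the node's current location.
$redirect = Redirect::create();
$redirect->setSource('old-section/my-page');
$redirect->setRedirect('/node/42');
$redirect->setStatusCode(301);
$redirect->save();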

 

Improving Page Speed

 

Google has used page load speed as a ranking factor for years now. As Google points out, faster sites keep users around longer, so it's not only Google you please by speeding up your website.

You might have to spend a little to get your website up to speed, but Drupal ships with several features that help pages load faster, such as BigPipe and its built-in page caching.
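BigPipe is part of Drupal 8 core, so speeding up perceived load times can be as simple as turning the module on (newer Drupal 8 releases enable it by default for new installs of the standard profile). A one-line sketch using the module installer service:

<?php

// Enable core's BigPipe module, which streams the personalized parts of a
// page after the cacheable parts have already been sent to the browser.
\Drupal::service('module_installer')->install(['big_pipe']);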

However, it’s not only desktop users you have to keep in mind, but mobile users, too. Given the leaps and bounds that technology has undergone in the last couple of years, you now find more and more people browsing the web on their smartphones and tablets. It’s important to make sure that your site experience is just as friendly and accessible on mobile devices as it is on desktop computers. As anyone who has used a desktop site on a mobile device knows, it’s not a pleasant experience.

Drupal’s default theme is responsive by design, which means it will display well on mobile screens of any size without having to do complicated rewrites of code or having to juggle multiple URLs to make sure your site displays correctly. With Google now also looking at the page speed of mobile sites, it’s now more important than ever to focus on delivering a good, well-optimized mobile experience to improve your SEO.

 

Read more: SEO Checklist Before Launching Your Drupal Website

 

Talking to Your Search Engine

 

Optimizing your website can be tough when you don't know basic things such as where your traffic is coming from. Installing the Google Analytics module makes you privy to exactly that kind of information, and for anyone keeping a finger on the pulse of a site's SEO, it's one of the most important tools to have.

With Google Analytics, you learn things about your visitors: where in the world they come from, which links they followed to reach your site, which pages they visit and how long they spend on them, which keywords they searched to find you, and more. If you care about SEO, data about your website coming directly from Google, the world's most popular search engine, is valuable to have and can help you decide what to improve next.
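The module mostly just needs your Analytics property ID. As a hedged sketch (the configuration object and key names here are assumptions to verify against the module's documentation, and the ID is a placeholder), the setting can also be made in code:

<?php

// Point the Google Analytics module at a (placeholder) property ID.
// Check the module's own schema for the exact config name and key.
\Drupal::configFactory()
  ->getEditable('google_analytics.settings')
  ->set('account', 'UA-00000000-1')
  ->save();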

And while you’re pulling information from Google about your website, you can also provide information about your website to Google in the form of an XML sitemap. These are specially formatted, condensed summaries of the pages of content on your website that you can submit to Google to help them find your site and let their bots crawl through your pages. Google can crawl through your site without an XML sitemap, but you take on the risk of them possibly missing pages.

With Drupal, generating an XML sitemap is as easy as installing the XML sitemap module, which creates one for you, and Drupal's built-in cron can keep the sitemap up to date with the latest information from your website automatically.
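That upkeep happens during Drupal's periodic cron runs, the same mechanism any module can hook into for scheduled work. A minimal, hypothetical sketch of that hook (the module name and log message are made up):

<?php

/**
 * Implements hook_cron().
 *
 * Runs on every cron execution; sitemap modules use this same hook to
 * regenerate their files on a schedule.
 */
function mymodule_cron() {
  \Drupal::logger('mymodule')->info('Periodic housekeeping ran.');
}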

 

Conclusion

These are only a few of the things you can do with Drupal to improve your SEO. Like anything worth doing, you can't just press a button and magically have a well-made website ready to dominate the first page of Google's search results. It takes a good grasp of the basics, as well as some effort, to have a well-optimized web page.

However, the road to that coveted #1 spot on the results pages becomes a lot less bumpy thanks to how easy Drupal makes it to optimize your site.

Want to boost your site’s traffic and rank #1 on Google with Drupal? Message us through our Contact Us page, or via email at sales@vardot.com.

OpenSense Labs: From Conception to Reality: Drupal for Futuristic Websites

August 9, 2018 - 17:45
Shankar, Thu, 08/09/2018 - 16:15

“Great Scott!”, exclaims the scientist in the renowned science fiction trilogy ‘Back to the Future’, which hit cinema screens in 1985. The exclamation comes from a scientist who travels 30 years into the future in his flying car and marvels at inventions that would be a colossal conundrum in the film's present day.

Drupal has been revolutionising web application development with its flexibility in integrating with futuristic technologies

Since the first ever website, invented by British scientist Tim Berners-Lee, went live in 1990, tech enthusiasts have watched enormous technological advancements unfold in the years that followed. In a similar fashion, Drupal has been revolutionising web application development with its flexibility in integrating with futuristic technologies.

So what is Drupal doing with futuristic technologies to give you that “Great Scott!” moment?

Futuristic Technologies: Strategic Trends with Broad Industry Impact

As stated by Gartner, an intertwining of people, devices, content, and services is known as an intelligent digital mesh. Digital firms are supported by enabling digital models, business platforms, and a rich, intelligent collection of services.

Intelligent: With AI seeping into virtually every other technology, well-defined focus can allow more dynamic autonomous systems.
Digital: Amalgamating the virtual and real worlds in order to create an immersive environment.
Mesh: The connections between a growing set of people, business, services, devices, and content for delivering digital outcomes.

Top 10 Strategic Technology Trends for 2018, by Gartner: the intelligent digital mesh is the intertwining of people, devices, content and services

Immersive experiences, digital twins, Artificial Intelligence, conversational platforms, Blockchain, continuous adaptive security among others, as depicted in the illustration above, form a foundation for the next generation of digital business models and ecosystems.

Beyond Websites: Integration of Future Technologies and Drupal

Drupal is among the front-runners when it comes to content-heavy websites. With new discoveries happening outside of Drupal, it makes sense to leverage Drupal's flexibility to incorporate next-generation technologies.

Artificial Intelligence

An area of computer science focused on creating intelligent machines that can work and act like humans, Artificial Intelligence has been the talk of the town ever since it burst onto the scene. How can it be leveraged for your Drupal websites?

Chatbots

There have been several phases in the way humans interact with computers. First came the terminal interface, built around the command line or DOS prompt. The second phase was the graphical interface, which used visual representations of programs, files, and actions. The third wave is the conversational interface, which lets users interact with computers using natural language.

Chatbots, powered by artificial intelligence technologies, are wonderful for your website as they offer conversational UI and can hugely benefit your enterprise. Drupal offers a useful set of modules that can help in the integration of chatbots in the website thereby providing a conversational interface to the users.

The Facebook Messenger Bot module, created by The White House, gives you the tools for developing a chatbot on the Facebook Messenger Platform.

Chatbot API, another Drupal module, can be used to incorporate a bot into the site. It is an additional layer that sits between your Drupal installation, your Natural Language Processing (NLP) tools, and your various chatbots and personal assistants. It can work with systems like Dialogflow, Alexa, Cisco Spark, Microsoft, and Twilio.

Digital voice assistants

While chatbots are primarily a text-based medium, digital voice assistants can be more human-like, with the ability to talk like a person. For instance, Google Duplex, the latest entrant among digital voice assistants, can hold a lifelike, human-sounding conversation and book hotels on your behalf.

Alexa, an integration module for Amazon Echo services, allows Drupal to respond to Alexa Skills Kit requests. The demonstration given below shows that by indulging in a casual interaction with Alexa, the shopper is able to preheat the oven, add the ingredients and cook the food without even looking at the phone or laptop.

When the shopper issues a verbal query, the input is converted into a text-based request which is then sent to the Freshland Market Drupal 8 website (a fictional grocery store). From there, a combination of custom code and the Alexa module responds to the Amazon Echo with the requested information.

Cognitive Search

Forrester, a research and advisory firm, defines cognitive search and knowledge discovery as “the new generation of enterprise search solutions that employ Artificial Intelligence (AI) technologies such as Natural Language Processing (NLP) and Machine Learning to ingest, understand, organize, and query digital content from multiple data sources”.

The Azure Cognitive Services API module allows Drupal to leverage Microsoft Azure Cognitive Services. It exposes machine learning APIs and allows developers to incorporate intelligent features such as emotion and video detection, speech and language understanding, and face, speech, and vision recognition.

Augmented Reality

Gartner defines Augmented Reality as the real-time use of information in the form of text, graphics, audio and other virtual enhancements integrated with real-world objects. It is this “real world” element that differentiates AR from virtual reality. AR integrates and adds value to the user’s interaction with the real world, versus a simulation.

A Drupal agency developed a chatbot prototype which helped customers choose recipes based on their health constraints and preferences. The chatbot provided an interactive experience that spared users intensive research before grocery shopping. By integrating AR with Drupal, they took it one notch higher.

The demo in the video displays a shopper interacting with the AR application. Freshland Market’s mobile application (fictional grocery store), which is built on Drupal 8, guides the shopper to make better decisions while shopping through AR overlays.

It superimposed relevant data like product ratings, price and recommendations over the product items detected by the smartphone camera. By showing the products that are best for her diet plan, the mobile application personalised the shopper’s experience.

Drupal's web services support and the JSON API module provided content to the mobile application. The Freshland Market Drupal 8 site stored all the product-related information, so if the Drupal content for any product was edited to mark the item as being on sale, the change was automatically reflected in the content superimposed by the mobile application. Furthermore, the product's location was stored on the site, guiding the shopper to it in the store.

Another use case is the Lift HoloDeck prototype which was developed using commercially available technologies - Drupal (content store), Acquia Lift (web personalisation service), Vuforia (AR library) and Unity (3D game engine).

Lift HoloDeck team developed a mobile application that superimposes product data and smart notifications over physical objects that are detected on the smartphone’s screen.

Consider a situation where a user shares his preferences with a coffee shop through its mobile application. Entering the shop, he opens his phone to the “deal of the day”. The application superimposes his diet plan, directions on how to order, and product data on top of the beverage. After glancing at the nutritional information, he orders his preferred drink and gets a notification when his order is ready to be picked up.

Virtual Reality

Virtual reality consists of computer-generated environments that simulate a physical world convincingly enough to feel real.

Virtual reality can be used to build cross-channel experiences. The demonstration shown in the video below features a student who is eager to explore more about Massachusetts State University (a fictional university). The video depicts that he is able to take a virtual tour directly from the university’s website sitting on his sofa.

Placing his phone in a VR headset, he can go around the university campus, explore buildings, and look at the program resources, photos and videos within the context of the virtual tour.

The Massachusetts State University's Drupal site stores all of the content and media featured in the virtual tour. The Drupal backend lets website administrators upload media and position hotspots directly. Using JSON API, the React frontend pulls in information from Drupal.
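For a sense of what “pulling in information” looks like, the JSON API module exposes content at predictable URLs such as /jsonapi/node/{type}. The hedged sketch below shows such a request with the Guzzle client that ships with Drupal; the domain and content type name are invented, and a React frontend would make the same request with fetch() instead.

<?php

use GuzzleHttp\Client;

// Fetch "tour hotspot" nodes from a Drupal 8 site running the JSON API module.
$client = new Client(['base_uri' => 'https://university.example.com']);
$response = $client->get('/jsonapi/node/tour_hotspot', [
  'headers' => ['Accept' => 'application/vnd.api+json'],
]);
$document = json_decode((string) $response->getBody(), TRUE);

foreach ($document['data'] as $resource) {
  // Each resource carries its fields under 'attributes', per the JSON:API spec.
  print $resource['attributes']['title'] . PHP_EOL;
}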

Internet of Things

In the broadest sense, the term Internet of Things (IoT) covers everything connected to the internet, but it is increasingly used to describe objects that can talk to each other. The IoT can be anything from simple sensors to smartphones and wearables connected together. Combining these connected devices with automated systems makes it possible to gather information, review it, and create an action plan to assist someone with a particular task or learn from a process.

DrupalCon New Orleans 2016 had a session that delved into bringing Drupal and the Internet of Things together. It included a demonstration using a barometric-pressure-sensing, GPS-enabled wearable armband connected to the internet, which could then display an icon showing the weather forecast for the current location.

The armband, tethered to an iPhone, sent latitude and longitude data over a mobile connection to a ThingSpeak channel (an API for the IoT). This in turn tracked the location of a ship by sending the data over HTTP to the Drupal 8 website. When the site received this authenticated POST data, new location nodes were created. That updated a map and table built with Views and changed a sidebar block to display the weather icon matching the ship's current location.
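In Drupal 8 terms, “new location nodes were created” maps onto the entity API. A simplified, hypothetical version of what the receiving code might do with one posted coordinate pair (the content type and field names are invented for illustration):

<?php

use Drupal\node\Entity\Node;

// Turn one authenticated payload of coordinates into a "location" node.
$node = Node::create([
  'type' => 'location',
  'title' => 'Ship position ' . date('Y-m-d H:i'),
  'field_latitude' => 59.3293,
  'field_longitude' => 18.0686,
]);
$node->save();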

Blockchain

Don & Alex Tapscott, authors of Blockchain Revolution (2016) define the blockchain as “an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value.”

Chainfrog has been working on blockchain technology since its early stages and devised a use case where the user data including communication history, address and the profile data will be available to everyone in a large organisation.

“Blockchain is an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value.”

To synchronise and securely segregate Drupal user data, big companies map user registration forms to a centralised company API. These companies usually run more than one Drupal installation for different departments, sharing a central codebase.

Thus, instead of maintaining an expensive API layer, Chainfrog proposed a Drupal 8 custom module backed by a distributed ledger to synchronise user data. Rather than making an HTTP call every time a new user is added, they planned to adopt a peer-to-peer immutable ledger, removing the middleman (in this case, the API).

Blockbinder, a product of Chainfrog that helps in connecting existing databases instantly, would keep a tab on user records in a data table whenever a new user is added to the Drupal system. Within 30 seconds of any new addition in one of the Drupal installations, all the installations would synchronise the user data.

At DrupalCon Nashville 2018, we presented possible use cases for bringing Drupal and the Interplanetary File System (IPFS) together. IPFS, whose working principles are related to blockchain technology, is a peer-to-peer hypermedia protocol. Centralisation is at the core of Drupal's workflow, in contrast to IPFS' decentralised nature, so we looked for ways to make the two work together.

Drupal, which allows content editors to seamlessly produce content, can be entwined with IPFS for an amazing digital experience. From helping archivists store a superabundance of data to helping service providers run a global Content Delivery Network (CDN), there are plenty of benefits. We presented a use case to make this a reality, which can be explored in this video.

Conclusion

These are some of the frontiers that Drupal has crossed, and there are plenty more emerging technologies whose potential with Drupal is still waiting to be explored.

The future is bright, with new inventions emerging across technology fields to make the world a better place. Digital firms can make significant strides in their online presence by integrating future technologies with their websites, and Drupal gives businesses a great platform to look beyond plain websites and incorporate emerging technologies to build futuristic ones.

Want to know how we develop Drupal sites for our clients and also maintain and support them? Ping us at hello@opensenselabs.com to set the ball rolling and make your website exciting with the incorporation of emerging technologies.


AddWeb Solution: ReactJS And Drupal - The Competent Combo!

August 9, 2018 - 17:31

What is ReactJS?

Created to enhance speed, simplicity, and scalability, ReactJS has been doing wonders ever since its initial release in 2013. It was created by Jordan Walke, a software engineer at Facebook, and its first deployment on Facebook's newsfeed proved so successful that it was later adopted by Instagram too. Among the open-source libraries used today for front-end web application development, ReactJS is the one reaching new heights, and there are multiple reasons for such booming popularity in such a short period of time, each adding to the current front-end UI landscape. No wonder it looks like ReactJS is here to stay!

What is Drupal?

Drupal is a big name in open-source web content management, powering prominent sites from journalism leaders like The Economist to the official website of the British Royal Family. Drupal's security and scalability have made it one of the most trusted platforms for web development.

Even though it is not an easy-to-use platform and requires technical expertise to build and maintain, Drupal has been chosen by top-notch players across industries, including Harvard University, Tesla Motors, ABS-CBN News, Warner Bros. Records, et al. In addition, decoupled Drupal is the approach where Drupal provides a strong back-end while the front-end is built with another framework, opening the door to an even richer front-end experience. It is one of the best things that has happened to Drupal.

The Marriage of ReactJS & Drupal:


ReactJS is gaining immense popularity, and the marriage of ReactJS and Drupal has become the talk of the town. In fact Dries Buytaert, the founder of Drupal, has shared that he is planning for Drupal to adopt React. Though the news has received mixed opinions from experts in the open-source community, it's definitely something worth listening to.

ReactJS and Drupal have been called a marriage made in heaven. ReactJS is one of the most evolved front-end libraries in the JavaScript world, but like every other JavaScript framework it needs a service-based back-end for storing data and handling business logic. Drupal, as a strong platform with built-in RESTful services, serves as an apt companion for a front-end framework as capable as ReactJS.

The rising need to pair a modern front-end framework with a robust back-end platform is another reason behind the buzz around the combination of ReactJS and Drupal. In fact, Dries has noted in his blog that React wasn't his first choice; he also weighed Angular and Ember. But citing the ever-evolving JavaScript landscape, he concluded that the combination of Drupal and React was what the modern web needed most.

The community behind ReactJS is also one of the major reasons it is considered so highly among other good frameworks. Whether you want to enhance the current page and a few of its elements or build a full-fledged single page app (SPA), this combination is a sure-shot success. Lullabot has shared some insightful information on the topic.

Some hardcore proponents of monolithic Drupal have publicly dismissed an official union of ReactJS and Drupal. But the web is ever changing and the possibilities endless, so wider acceptance can be foreseen. Of course, it remains a matter of debate, and only time will tell how things turn out and what prevails in the open-source community!

I hope this blog helps you to expand your ReactJS and Drupal knowledge …Thanks! If you need more assistance regarding Drupal Development Services, feel free to contact us now.

Drupal Association blog: A more sustainable framework for fiscal sponsorship of Drupal camps

August 9, 2018 - 13:30

Camps are Drupal’s growth engine and they take place all over the world. They attract local developers, connect them with resources to learn how to use Drupal, and most importantly, they provide on-ramps into the community. We are incredibly thankful and amazed at the level of commitment and contribution that organizers invest in their events. This is a very important way to contribute back to the project.

The Drupal Association supports camps as we can. We provide grants to new events through Community Cultivation Grants (check out this GoaCamp story). We also provide fiscal sponsorship to camps. This means we let organizers deposit their camp income into the Drupal Association bank account, taking advantage of our non-profit  status. Then, they work with our operations team to pay bills out of the account.

It’s been an honor to help several camps this way. However, this program has two major challenges. 1) We are not able to support camps globally because we can’t work in every currency, so most of the camps we support are in the U.S. 2) As we became a smaller organization, we have fewer staff to support this program. We haven’t been as fast at processing funds as we would like or our camps need.

Knowing how important camps are to Drupal, how organizers need their work made easier, and that we need to provide global support, we decided that the best way to provide better fiscal sponsorship is by referring community groups to organizations whose business is set up to provide this service. Over the years, we have watched several organizations get very good at providing fiscal sponsorship to open source projects.

We therefore have been looking at best practice models across many open source communities and we are happy to partner with Open Collective, a company specializing in fiscal sponsorships and other open source funding opportunities. They have the ability to scale and offer the level of service to meet a camp’s needs. In the US, Open Collective Foundation has recently obtained their 501(c)(3) status, and will be able to sign for and represent your camp as we have done in the past. Their platform, itself an open source project just like Drupal, gives camp organizers full transparency, and on-demand reporting so they can manage a camp effectively.  Additional details about Open Collective can be found here.

Because of this opportunity, we have made the choice to sunset our internal program as of August 31, 2018.

While we have chosen to partner with Open Collective to assist in this transition, we strongly believe in choice and there are other fiscal sponsorship opportunities that you can choose to roll your funds to, such as Software In The Public Interest and the Software Freedom Conservancy.

We know that each camp is in a different stage of planning, and we are dedicated to making sure that the transition is smooth and will not affect the activities and success of camps. We will be reaching out to camp contacts to schedule time to talk through the transition. From there, we will roll the funds to a legal entity that you have chosen.

We are thankful for all the camps we were able to help get launched, and continue to watch their growth year after year. We hope this transition will help our camps grow and scale with no limitations.

Drupal core announcements: Core JavaScript will be formatted using Prettier (many patches will need rerolls)

August 9, 2018 - 09:32

Drupal 8.6.x is now in a beta phase, which means we will now undertake disruptive cleanup tasks like adjusting coding standards. The main standards change in this release cycle will be the adoption of the Prettier code formatter.

Work is underway to patch core for this change, which will touch many files, so be aware that you will need to reroll patches for conflicts and adjust them to use the new code style rules set by Prettier by running yarn prettier.

Agiledrop.com Blog: AGILEDROP: How to Speed Up Drupal Websites with Google AMP

August 9, 2018 - 07:53
Google's AMP is an open source project that stands for Accelerated Mobile Pages. As the name implies, this project aims to serve mobile pages instantly, without long load times. You can get a feel for AMP by searching for anything on your mobile on Google and clicking on links with a lightning sign beside them. Speed is an important factor for any website as it has a number of benefits, including making a massive impact on a site's SEO. In this post, let's take a look at AMP for Drupal 8 and a brief overview of its implementation on the CMS.

There's a module for that! As always, there's a… READ MORE

Drupal Association blog: 2018 Q1 & Q2 Financials Statement Summary

August 9, 2018 - 02:15

Our board of directors is responsible for the Drupal Association’s financial health and as part of their duty, they review and then vote to approve monthly financial statements. The board met virtually on July 25, 2018 and voted to approve the Q1 & Q2 2018 financial statements, which can be found here.

Each month we compare our results against the financial KPIs we have set with the advice of our virtual CFO, Summit CPA. These KPIs were set to help us focus on increasing our net income so we can build a stronger cash reserve to ensure the organization’s sustainability.  

Our 2018 Financial KPIs are:

  • Cash Reserve: have a cash balance of 15% of Total Revenue
  • Net Income Profit Margin: end 2018 with a net income profit of 4%
  • Increase our Non-Event Revenue to $1.6M
  • DrupalCon Profit Margin of 27%

As of our June financial statement, which was approved by the board, the organization is tracking well against these KPIs.


KPI analysis through June 30 is looking positive for money in the bank, net income, non-event revenue, and event profit margin.

You can see that April was lower than the ideal target, due to missing revenue in a couple of areas. One with DrupalCon Nashville, where ticket sales came in lower than expected, and the second was some hosting contracts coming in later. These contracts will be reflected in future months.

We will monitor all KPIs through the year to ensure we are on track. However, one KPI is now complete: the Nashville profit margin. DrupalCon Nashville was forecasted to come in at a net profit of $445K, or 22%, at the close of the conference in April 2018. While training tickets under-performed, resulting in lower than expected ticket revenue, we still exceeded our net profit goal thanks to a decrease in expenses and an increase in sponsorship revenue. The final net profit was $481K, or 25%, which is 2% under the set KPI.


Details for the DrupalCon Nashville forecast and actual income

While we did exceed our net profit forecast, it should be noted that this event did not generate as much for the project as past DrupalCons. This is because Nashville’s cost per attendee was higher than usual due to the location. However, at the time of selecting the venue, it was the best option compared to the other available cities. The Drupal Association continues to seek ways to diversify revenue so we are not so reliant on one event to fund the project.


The overall trend shows Nashville coming in lower than recent DrupalCon North America net income margins

DrupalCon is evolving and we are making changes. While the programming, speakers, and sessions make up the core of DrupalCon, our event staff is retooling and creating more value to serve everyone in the Drupal ecosystem.

We would not be able to do our mission-driven work without the support and contributions of our community. Contributions come in many forms, through the purchase of DrupalCon tickets and event sponsorships, through our Supporters and Members, Drupal.org sponsors, recruiters who post jobs on Drupal Jobs and many other fantastic ways our community supports the Drupal ecosystem. We are deeply grateful for everyone who contributes time, talent, and treasure to move Drupal forward.

Thank you!

InternetDevels: Mail in Drupal 8: the built-in system and useful modules

August 8, 2018 - 21:00

The dream of many website owners is to have email sending opportunities on their websites. Of course, it’s possible with Drupal 8, because it has infinite powers.

Read more

TEN7 Blog's Drupal Posts: Episode 036: Matthew Tift

August 8, 2018 - 20:54
Dr. Matthew Tift, Senior Drupal Developer at Lullabot, musicologist, podcast host and educator, sits down with Ivan Stegic to discuss his fascinating career and passion for those things open source. Discussing: Matthew's midwest ties, Walking meetings, The advantage of working at home, Working with Wisconsin Public Radio, Sea Grant Non-Indigenous Species Project. Dogpile and Metacrawler, Automate that process, C#, ColdFusion, VB6 Discovering Drupal, TTBOOK (To The Best of Our Knowledge), Accessible public information, Teaching kids to code, Finch Robots, Tonka Coder Dojo, "The Open School House", Live coding, Algorithmic Music, Algoraves, Toplap.org, Syncthing.

Mediacurrent: Marketer’s Guide to Drupal 8: Healthcare Marketing Q&A

August 8, 2018 - 20:09

Alan Onnen is the Associate Director of Marketing for the Shirley Ryan AbilityLab, recognized as #1 in rehabilitation for 27 years in a row. AbilityLab delivers its revolutionary care through 5 Innovation Centers: state-of-the-art hospital facilities and equipment for exceptional patient care provided by the best medical and nursing support.

With 15 years of experience in the marketing industry, the past 5 with SRA, where he was part of the team that helped adopt Drupal, Onnen has seen firsthand how Drupal 8 powers digital strategy.

Mediacurrent Interview with Alan Onnen 

Mediacurrent: What does “digital transformation” mean for you? 

Alan Onnen: Digital transformation means a constant evolution. There’s no single transformation; it’s a constant state of change, staying on top of trends at once. As a digital marketer, you need to know a little bit about everything, UI, UX, nerdy stuff, best practices, changes in the digital environment, what people expect from websites in your vertical, etc. Some people think transformation is a binary term - something new - but it's not.

Mediacurrent: How does open source fit into the equation?

AO: Open source is something that's not new, but it's getting so mainstream that it's part of that digital transformation. It's about adjusting to a new world where open source doesn't mean insecure - it means that it's open and honest. We had to get buy-in from stakeholders. They dismissed it at the beginning of the RFP because they thought you needed a Sitecore or an AEM. It took a long time and a lot of agency people showing how safe it is to make them believe that open source isn't a dirty word.

Mediacurrent: What current challenges are you trying to solve for?

AO: It is a constant struggle to keep up with Google - making sure our content is optimized for search algorithms. Our overall challenge is to keep our content fresh, navigating innovative best practices for our website while keeping up with legal and social constructs.

Mediacurrent: How are you using Drupal 8 to solve those problems? 

AO: One of the big reasons we chose Drupal was because of its customization ability. Our knowledge base is spread across so many people so Drupal’s ability to customize the backend experience and offer the fields and plain English way we need to talk about things is really important. Even just the simple need for content creators to be able to edit things and be able to customize that experience.

Another big reason was the fact that its open source and the community surrounding Drupal. If you have an idea you can find someone who has half baked or full-baked into that particular module or idea to help give your devs a headstart solution. With Drupal, you don’t have to start from scratch when you need something new to move the website forward. Chances are, someone has had a similar idea you can pull from.

Mediacurrent: Has this been your first experience with Drupal or have you worked with previous versions of Drupal in the past? What did Drupal 8 give you from a marketers/content editors perspective?

AO: I came to SRA on a proprietary healthcare-based CMS. It was designed to serve mid-size to small hospital systems, and we didn't have access to the backend part of the site before. SRA put out an RFP for a replatforming and redesign of our website. We talked to different agencies, and Drupal kept coming up - there were no licensing fees with open source. The spin-up on Drupal is more robust than most paid CMS experiences. From a cost point of view, having it be free and open was very appetizing, and Drupal had other features that appealed to us.

Mediacurrent: Since launching on Drupal 8, have you noticed an increase in website conversions? What would you attribute that success (or lack of success) to? Use of marketing automation strategies? Ease of integration?

AO: Drupal can be leveraged any which way you want it to be. We take advantage of the extensive list of modules. We have seen nice conversions off the YAML module and the Webform module. It's true to the module philosophy that you can build them how you want to.

With Drupal, our web traffic has been up. We have 3 very different facets of our site - rehab measures database, research educational platform, home site - and Drupal can support them all very well. It’s a testament to Drupal - with a flexible CMS, reporting, user interfaces, and a back end that can be robust enough to bring things together in an organic and seamless way. 

Mediacurrent: What are 3 factors you look at when evaluating an agency? Cost? Reputation? Their own web design? Logos they've sold? 

AO: With our RFP out, we began evaluating the superficial - books, examples, case studies, white papers, whether their leadership had given talks and what they had talked about, the look and feel for brand consciousness - exploring that space of ability. We didn't want someone who was making cookie cutter websites and we didn't want to look only in the healthcare vertical. Our list was narrowed down to those whose work we respected and admired.

In the RFP, the CMS wasn't a consideration. We didn't tell people which platform they needed to be on. We asked for the cost, their preferred CMS and why, and we never cared about where the agency was located. It's important to know the people at the agency - communication is critical. For instance, in their responses to the RFP, are there timelines? Are they realistic? Do they make sense? It's easy to see how much effort they put in.

No one else did research like you guys [Mediacurrent] did before they got there for a face to face meeting. Your team said “oh, well we’ve already talked to discharge managers, nurses, planners.” They went through example personas, guessing on journeys, patients - and they were smart with how they handled it and took the initiative that early in the process. That showed us a lot about them. It wasn’t a giant new business budget and they didn’t ask for money up front. 

In all, the RFP process was about 4 months.

Mediacurrent: As a marketer using Drupal, what are some of the hot topics you'd like to know more about today? Personalization, marketing automation, etc.

AO: I’d like to know more about:

  • Integrations with personalization
  • Integrating with Google Analytics, tracking to AEM, AdWords, and APIs that move page data to backend sites
  • Marketing Automation capabilities

Mediacurrent: What advice would you give other CMO’s/VP’s/Director’s who are hesitant to move to Drupal 8?

AO: I would say it depends on what their hesitation is. You have to be committed to the build of your site. You need to really understand your content creators, the users of your CMS, the scope of what they want to be doing, and what they could be doing on the front end. It's important to know the ingredients - you can muck up Drupal and waste dev hours if you don't know how the workflows should go, or your taxonomy and pathing modules.

Drupal requires a Digital Marketer to have a vision for what they want it to be before they start developing - or else they risk having to go back and retrofit into their CMS environment that they could have efficiently put in the first time.

The journey of CMS and Drupal needs to be a thoughtful one.

______________________________________________________

We want to extend a big THANK YOU to Alan for participating in this interview. In the next part of the blog series, we will dig into the top reasons for Drupal 8 and why enterprise marketers choose Drupal.

Mediacurrent: Break it Down For Me, Shrop: Tackling Drupal Security Update SA-CORE-2018-005

August 8, 2018 - 19:29

Security maintenance — and the ability to apply security updates quickly — is part and parcel to open source project success. 

Updating is typically done as part of the normal software release cycle, however, there are times when a security advisory needs to be released ASAP. A strong incident response plan builds a first defense line to mitigate and patch vulnerabilities. 

But what does a successful security response look like in action?

On the heels of a recent Drupal security update on August 1, 2018, Mediacurrent’s Senior Project Manager Christine Flynn had the same question. To find out, she interviewed our Open Source Security Lead, Mark “shrop” Shropshire, to get a layperson’s perspective on the security team’s approach.

 

“An off-cycle Drupal security advisory dropped on August 1, 2018. What does that mean for folks who aren’t developers?”

Flynn: I was watching the Slack channel as our team fixed sites, and I got some idea of what was happening. I’m not going to jiggle anybody’s elbows while they’re applying a security update, but I’m really curious now that the fixes are all in. 

Shrop: The official Drupal Security Advisory came out late in the day, after Symfony published their announcement in the morning. There was also one from Zend.

Flynn: I read all of those links while the team was applying the security update, but I feel like I didn’t totally understand the implications. I’d love to get a better picture from you of what they mean.

Shrop: You bet! I hope you can hear me, I’m at a coffee shop right now.

Flynn: Are you on their unsecured WiFi?

Shrop: Nope! I’m on a hotspot and on VPN. It’s funny, the more you know about security, the more it changes what you do. Other people think you’re paranoid. But you’re not! You just understand the realities. 

Flynn: Ha! Why am I not surprised? All right, let’s dig in.

“What was the security update for?”

Shrop: Drupal Core was updated because there were some security releases for Symfony. We call those “upstream” in the biz, which means that Drupal depends on them, and they are actively worked on outside of Drupal. I understand the Symfony project worked closely with the Drupal Security Team to make sure Symfony and Drupal were both updated and ready to be announced publicly at the same time. Drupal version 8.5.6 pulls in the Symfony updates as part of the Drupal update process. 

Flynn: Was that the only update?

Shrop: No, at the same time, there was also an update to Zend Framework, but that was only an issue for users who were making use of modules or sites that used Zend Feed or Diactoros. There is a core issue to update the related Zend libraries for those who require or need the updates.

“If not updated, what could a malicious user do to a site?”

Shrop: This is a hard one to answer this soon after the release of the security advisory. I’m going to do some checking to see if I can get more information on this for academic purposes, but the Drupal Security Team is not going to make any statements that could help someone attack a site. It is up to security teams and researchers to dig into the code and determine more about the risks involved.

Based on the Symfony project’s blog post, it appears that a specially crafted request could allow a user access to a URL they do not have access to, bypassing access control provided by web servers and caching mechanisms. That’s a fancy-pants way of saying that a website visitor could gain access to pages you don’t want them to see.

“When will we know more?”

Shrop: Within days - sometimes hours - we might start to see exploit methods posted on the Internet. Taking security seriously and responding quickly once a drupal.org security advisory is announced is a way to stay ahead of these concerns.

Mediacurrent doesn’t want to fearmonger, but it is better to be safe than sorry. That’s why I always push to update as soon as possible while weighing in on mitigating factors that may lessen the severity of the issue for a particular application. But I will keep digging. I’m curious! 

“If you had to tell a CEO or CFO the value that implementing this security update swiftly provided, what would you say? Let’s say this CEO does not have a strong background in technology or security.”

Flynn: I could see an executive with a strong public safety or physical security background being pretty understanding of why you want to apply a security update for a potential vulnerability quickly, but what if it’s someone who doesn’t have that experience, and isn’t a technologist?

Shrop: Check out this link from Acquia about the security update. This helped me so much. They published this shortly after the PSA came out, and although they’ve updated the text since then, they said at the time, “It is advised that customers set aside time for a core upgrade immediately following.” When I read, “immediately,” I knew that we had to get the update out within hours. If I was asked to get on a call with the executives from any company, at that point, I am confident. If Acquia is saying it, we need to do it. That’s enough to stand on with anybody. I’m not saying that the Acquia team has more information, but they have a very robust security team. They always dig in quickly. They have to, to know if they can mitigate the issue by adding web application firewall rules.

Flynn: Firewall rules? How does that work? 

Shrop: The last few core updates, Pantheon and Acquia put mitigations into their WAF - that’s Web Application Firewall. Pantheon confirmed the night of the security advisory release that they were blocking attempts on their platform, and Acquia did the same thing. So if someone tried to exploit a site that was hosted there before Drupal was updated, they were there, helping to prevent that site from being attacked successfully. It’s a great extra layer of protection. Now, me and Acquia and Pantheon will always still want to update Core on each site, because WAF-level mitigation might not catch everything. But I am super happy when I see it because there’s a good chance that it will catch anything that happens while a team is still implementing a security update.

Security is all risk assessment and mitigation. You want to layer defenses. And something like this, we are going to make sure we deal with this problem. That’s why Acquia, Pantheon, Platform.sh, and others in the community immediately add those extra mitigations to their firewalls. It’s to buy time so that people can get their updates in. That’s not where mitigation ends, but it helps. 

“What type of sites were affected by this? Does everyone use Symfony?”

Flynn: When I first read about the upcoming security advisory, I saw that it affected “third party libraries.” That made me think that some of our clients might not be affected because it would only affect certain modules. Can you tell me what types of sites were affected?

Shrop: Got a link for you, but basically, anything on Drupal 8 was affected. Drupal 8 uses components from the Symfony project. The Drupal community made the decision to use Symfony so that we didn’t have to maintain everything ourselves. So this is a great example of the power of open source, with the Symfony and Drupal security teams working together to release this fix. We all end up benefiting from having a larger community to fix issues. There’s no way an internal team working by themselves can write as secure applications on their own compared to open source software, in my opinion. It has nothing to do with how good you are, it’s the nature of development. With open source, you have a greater team with Drupal and then again, with Symfony, an even greater team to lean on. With each community that is included you are expanding your team and your ability to detect and prevent threats. 

“How was the security vulnerability discovered?”

Shrop: That’s generally never disclosed because you never want to tell malicious users how you found an opening. 

But we do have a few people to thank: Michael Cullum and @chaosversum were thanked by Symfony for separately reporting the two issues addressed in Symfony security releases. They also thanked Nicolas Grekas for implementing the fix. I would also give a huge thanks to Symfony and the Drupal Security Team for coming together to implement the fix and for coordinating the announcements. It’s hard work, and it shows the community at its best.

“So when we have an off-cycle security release, first the PSA comes out. Can you tell me a bit about what Mediacurrent does from the time the PSA comes out to just before the security advisory drops?”

Flynn: As someone on the team at Mediacurrent, I can see some of the things you do. But I’m wondering what else happens behind the scenes? 

Shrop: The first thing that happens is that I’m notified about the PSA coming out. I’m signed up for updates via email, Twitter, and RSS feeds from https://www.drupal.org/security, and so are a lot of other folks at Mediacurrent. Internally, we have some processes that we have standardized over time for how to deal with security updates that we follow across the company. We centralize information we have on the security PSA/advisory, recommend client communications, and talk about how to prepare as a team. We have multiple communication threads internally, as well, so no one can miss it. I send an email to the staff and I post in our Slack in a few places to get us ready.

Flynn: I know that we often clear time in advance for the team to implement the security updates.

Shrop: Yep. All of us share more information as a team as official information is released or as our own investigations reveal information. For example, early on the day the security advisory was released, our DevOps Lead, Joe Stewart, noticed that Symfony had put out a notice that they were also going to be releasing a security update that day, so that gave us a heads up that it might be related. We couldn’t know for sure until the security advisory actually came out, though. No one can do it by themselves, which is why we have a whole team working on it - it’s the only way to handle these things. ​​​​​​

“So then the security advisory drops. How did we go about fixing the issue?” 

Shrop: First, we reviewed the advisory to assess risk and for any mitigations that help determine how quickly we need to perform updates. With this advisory, it was needed pretty much immediately, so we started to update Drupal core for our clients and pushed to test environments. Our QA team performed regression testing related to the update. Once QA approved each update for each client, we worked with folks to approve the updates and release them to the live environments. 

The important points are to line everyone and everything up in advance, have the talent in-house who can work on clients of all shapes and sizes and needs, and then to work as a team to resolve the issue on every client site as quickly as possible. 

“Were there any sites that were trickier to update? Why?”

Shrop: Clients that were on older versions of Drupal Core, who had delayed upgrading, were harder to update. Every site was updated within a short time, regardless, but even though they started at the same time, those clients did not finish first, because there was more development and testing needed on each site.

Flynn: What was different about the process to update those sites? 

Shrop: If a client wasn’t on version 8.5.x, the lead technologist on the project had to work on an alternative update to secure the site or application, since there wasn’t a security update released for it. Figuring out an alternative process on the fly always introduces risk. It’s part of the value that we bring, that we have team members that have the expertise to evaluate that sort of thing. For example, we had one new client that was on an older version of Drupal 8 core. So one of our Senior Drupal Developers, Ryan Gibson, had to go in and determine what to do. He ended up updating Symfony itself to mitigate the risk. 

Flynn: I’m guessing that we are going to recommend to that client that we update Drupal core for them very soon?

Shrop: Yes. The big takeaway is you’re lowering your risk of problems by staying on the most recent, up-to-date minor version of Drupal 8. Version 8.5.x is current and stable right now, so you should be on that.

Flynn: Why would a client not update?

Shrop: There are always dynamics. I hear lots of good excuses, and I’m not exaggerating, they are good, real reasons! The client is busy, the client has multiple workstreams, it’s hard - but it is getting to a point where I want to recommend even more strongly to clients that it is more expensive to not upgrade. It is going to cost them more when there is an update because we have these additional evaluation and update tasks. The whole point of Drupal 8’s release cycle is to spread the maintenance cost over years rather than getting hit all at once. 

Flynn: And it introduces greater risk. A security breach is an order of magnitude more expensive than extra mitigation steps.

Shrop: Definitely.

“When is the next version of Drupal Core coming out?”

Shrop: Version 8.6.0 will be released in September. Our teams are already starting to test the early versions of this release on some of our projects. If a security update comes out in September, we want all of our clients to be prepared by being on the currently supported version of Drupal core. That way, they will receive security updates.

Flynn: One of the nice things about the Drupal development community is that they provide the betas of the next version of Drupal core so you can get ahead of the next release, right?

Shrop: Yes. When the community starts releasing betas or release candidates, especially release candidates, you want to start testing ahead of time. If you have a Drupal site, you can get your developers to test. If you find a problem, it may not be with your site, it might be an issue with Drupal core and this is a great opportunity to contribute your findings back to drupal.org and help the greater community. There might be a security release weeks after a version comes out and you want to be prepared to implement it.

Flynn: It goes back to risk mitigation.

Shrop: If you are on, say, an 8.2 site right now, you’re on the higher risk side, unfortunately. We advise our clients that it is in their best interest to be on the current, stable version. It costs our clients more in the long run if they don’t update on a steady basis.

Flynn: So if you’re on an older version of Drupal Core, you might not get an easy-to-implement security update when a vulnerability is discovered?

Shrop: The quotes from the Drupal Security Team I really want to emphasize are, “Previous minor releases will become unsupported when a new minor release is published,” and, “Any additional security updates for officially unsupported branches are at the sole discretion of the security team.” This is important to understand. For the SA-CORE-2018-002 fix earlier this year they provided release updates for older versions of Drupal… but they didn’t have to. In the case of the fix last week, they did not.

“What was the best gif exchange of the Drupal core security update process?”

Flynn: I nominate this one, from mid-afternoon:

Shrop: Definitely! 

“What story didn’t we tell yet?”

Shrop: I think we covered most of it. The last thing I’d put out there is for the technical folks reading this. You need to read the security advisories, join Drupal Slack, read what Acquia, Pantheon, and others are saying about each announcement. Then, you take all of that in and make your assessment of what actions you are going to recommend your organization take. This should lead your organization to a documented security plan that you follow. But, you know… 

Flynn: “Update all the things”?

Shrop: Exactly!

Other Resources
7 Ways to Evaluate the Security and Stability of Drupal Contrib Modules | Mediacurrent Pantheon Guest Blog 
Security by Design: An Introduction to Drupal Security | Mediacurrent Webinar

PreviousNext: Encrypted Drupal Database Connections with Amazon RDS

August 8, 2018 - 14:46

Malicious users can intercept or monitor plaintext data transmitting across unencrypted networks, jeopardising the confidentiality of sensitive data in Drupal applications. This tutorial will show you how to mitigate this type of attack by encrypting your database queries in transit.

by Nick Santamaria / 8 August 2018

With attackers and data breaches becoming more sophisticated every day, it is imperative that we take as many steps as practical to protect sensitive data in our Drupal apps. PreviousNext use Amazon RDS for our MariaDB and MySQL database instances. RDS supports SSL encryption for data in transit, and it is extremely simple to configure your Drupal app to connect in this manner.

1. RDS PEM Bundle

The first step is ensuring your Drupal application has access to the RDS public certificate chain to initiate the handshake. How you achieve this will depend on your particular deployment methodology - we have opted to bake these certificates into our standard container images. Below are the lines we've added to our PHP Dockerfile.

# Add Amazon RDS TLS public certificate.
ADD https://s3.amazonaws.com/rds-downloads/rds-combined-ca-bundle.pem  /etc/ssl/certs/rds-combined-ca-bundle.pem
RUN chmod 755 /etc/ssl/certs/rds-combined-ca-bundle.pem

If you use a configuration management tool like Ansible or Puppet, the same principle applies - download the .pem file to a known location on the app server.

If you have limited control of your hosting environment, you can also commit this file to your codebase and have it deployed alongside your application.

2. Drupal Database Configuration

Next you need to configure Drupal to use this certificate chain if it is available. The PDO extension makes light work of this. This snippet is compatible with Drupal 7 and 8.

// Point PDO at the RDS certificate bundle so the MySQL connection is
// negotiated over TLS. Skipped gracefully if the bundle is not present
// (for example, in local development environments).
$rds_cert_path = "/etc/ssl/certs/rds-combined-ca-bundle.pem";
if (is_readable($rds_cert_path)) {
  $databases['default']['default']['pdo'][PDO::MYSQL_ATTR_SSL_CA] = $rds_cert_path;
}
3. Confirmation

The hard work is done; now you'll want to confirm that the connections are actually encrypted.

Use Drush to smoke-test that the PDO options are being picked up correctly. Running drush sql-connect should now include a new flag: --ssl-ca.

$ drush sql-connect

mysql ... --ssl-ca=/etc/ssl/certs/rds-combined-ca-bundle.pem

If that looks OK, you can take it a step further and sniff the TCP connection between Drupal and the RDS server.

This requires root access to your server and the tcpflow package installed - this tool will stream the data being transmitted over port 3306. You want to see illegible, garbled data - definitely not content that looks like SQL queries or responses!

Run this command, and click around your site while logged in (to ensure minimal cache hits).

$ tcpflow -i any -C -g port 3306

This is the type of output which indicates the connection is encrypted.

tcpflow: listening on any

x1c
"|{mOXU{7-rd 0E
W$Q{C3uQ1g3&#a]9o1K*z:yPTqxqSvcCH#Zq2Hf8Fy>5iWlyz$A>jtfV9pdazdP7
tpQ=
i\R[dRa+Rk4)P5mR_h9S;lO&/=lnCF4P&!Y5_*f^1bvy)Nmga4jQ3"W0[I=[3=3\NLB0|8TGo0>I%^Q^~jL
L*HhsM5%7dXh6w`;B;;|kHTt[_'CDm:PJbs$`/fTv'M .p2JP' Ok&erw
W")wLLi1%l5#lDV85nj>R~7Nj%*\I!zFt?w$u >;5~#)/tJbzwS~3$0u'/hK /99.X?F{2DNrpdHw{Yf!fLv
`
KTWiWFagS.@XEw?AsmczC2*`-/R rA-0(}DXDKC9KVnRro}m#IP*2]ftyPU3A#.?~+MDE}|l~uPi5E&hzfgp02!lXnPJLfMyFOIrcq36s90Nz3RX~n?'}ZX
'Kl[k{#fBa4B\D-H`;c/~O,{DWrltYDbu
cB&H\hVaZIDYTP|JpTw0 |(ElJo{vC@#5#TnA4d@#{f)ux(EES'Ur]N!P[cp`8+Z-$vh%Hnk=K^%-[KQF'2NzTfjSgxG'/p HYMxgfOGx1"'SEQ1yY&)DC*|z{')=u`TS0u0{xp-(zi6zp3uZ'~E*ncrGPD,oW\m`2^ Hn0`h{G=zohi6H[d>^BJ~ W"c+JxhIu
[{d&s*LFh/?&r8>$x{CG4(72pwr*MRVQf.g"dZU\9f$
h*5%nV9[:60:23K Q`8:Cysg%8q?iX_`Q"'Oj
:OS^aTO.OO&O|c`p*%1TeV}"X*rHl=m!cD2D^)Xp$hj-N^pMb7x[Jck"P$Mp41NNv`5x4!k1Z/Y|ZH,k)W*Y(>f6sZRpYm
8Ph42K)}.%g%M]`1R^'qh/$3|]]y"zEh0xG(A]-I`MJGU7rKO~oi+K:4M(nyOXnvaWP4xV?d4Y^$8)2WOK,2s]gyny:-)@D*F%}ICT
Tu>ofc)P[DQ>Qn3=0^fuefIm1]-YHq5rx|W(S3:~2&*6!O|DAZWB:#n9|09`I`A3bq@\E\$=/L5VHm)[#|tI"lkuK.u|!2MT/@u7u(S{"H.H'Fh/4kF_2{)Jc9NQ%jA_rI1lH;k'$n~M_%t%y)t!C_4FO?idwMB]t^M::S!a=*Jee<3sgX@)L;zAuTN2}v#K4AX.(`X1<{#
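
Another way to confirm the encryption, from inside the application itself, is to ask MySQL which cipher (if any) the current connection negotiated. This is a minimal sketch, assuming Drupal 8 and the default database connection; you could run it via drush php-eval or a throwaway script:

// Ask the server for this session's negotiated TLS cipher.
// An empty 'Value' means the connection is NOT encrypted.
$status = \Drupal::database()
  ->query("SHOW STATUS LIKE 'Ssl_cipher'")
  ->fetchAssoc();
print_r($status); // e.g. ['Variable_name' => 'Ssl_cipher', 'Value' => 'ECDHE-RSA-AES256-...']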

Tagged MySQL, TLS

Zhilevan Blog: Fix Drupal Files/Directories permissions by PHP after hacked

August 8, 2018 - 11:18
Last night a customer of my former company called me needing help to recover their hacked website. First of all, I installed the Hacked module to check for changed files and restore them, then tracked down and removed the backdoor files whose job was to inject external code into the website (most of the time, JavaScript for traffic hijacking).
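
The post doesn't include its cleanup script, but as a rough sketch of the "fix permissions with PHP" step the title describes, something along these lines resets code files to the commonly recommended Drupal permissions (directories 0755, files 0644). The docroot path is an assumption, and writable paths such as sites/default/files would need different treatment:

<?php
// Sketch only: walk the docroot and reset permissions after a compromise.
// Assumes $drupal_root points at your Drupal installation.
$drupal_root = '/var/www/html';

$items = new RecursiveIteratorIterator(
  new RecursiveDirectoryIterator($drupal_root, FilesystemIterator::SKIP_DOTS),
  RecursiveIteratorIterator::SELF_FIRST
);

foreach ($items as $item) {
  // Directories need the execute bit; regular files should not be executable.
  chmod($item->getPathname(), $item->isDir() ? 0755 : 0644);
}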

Drupal.org blog: What's new on Drupal.org? - July 2018

August 8, 2018 - 03:23

Read our Roadmap to understand how this work falls into priorities set by the Drupal Association with direction and collaboration from the Board and community.

Announcements

Git remote URL changes for full projects and sandboxes

Git authentication methods for Drupal.org hosted projects are changing as we approach upgrading our developer tooling stack.

In particular we are:

We have updated the version control instructions for Drupal.org projects, and put a message in our Git server for any user who makes a push using the deprecated format.

For more information, please review: https://drupal.org/gitauth

Reminder: Drupal Europe is coming up soon

Drupal Europe is coming up in less than 40 days! Drupal Europe will be the largest gathering of the Drupal community in Europe and is a reimagining of this important community event as both technical conference and family reunion. The Drupal Association engineering team will be attending to connect with the community, provide updates on Drupal.org, and listen to some of the incredible speakers who will be in attendance.

Join the community in Darmstadt, Germany, from September 10-14, 2018. Make sure to register, book your travel, and secure accommodation: http://drupaleurope.org/

We want your feedback on ideas for Drupal Core

The Drupal Association has proposed several initiatives for Drupal Core - but before they can be officially adopted they need feedback from stakeholders in the community (even if it's just a "+1") and to reach community RTBC. Here are the proposals:

Drupal.org Updates

Staff retreat

In July the Drupal Association gathered together in Portland, Oregon for our bi-annual staff retreat. At these retreats we discuss the progress made in the last six months, and our prioritization as an organization going into the next six-month period.

Hightech industry page launched

Drupal is the CMS of choice for a variety of companies in the high tech space, including organizations like Red Hat, Cisco, and Tesla. Whether it is used in a front-facing application, as a decoupled back-end, or for an internal intranet, experts in high tech turn to Drupal for their needs.

We launched a new industry page featuring these stories from high tech in July.

Drupal.org API updated for security advisories

To improve the automated toolchains built by organizations and individuals in the community to watch for new security advisories, we've updated the Security Advisory API. One of these changes ensures that the full canonical identifier for each advisory is included in the API data, which is a small but valuable change for anyone monitoring the API for advisory information.

Social Media Sharing for Events News

The DrupalCon news feed now includes social media sharing icons, so that you can better promote DrupalCon news and announcements to your networks. Word of mouth has always been a critical part of Drupal's success - so we hope that as featured speakers are announced, early bird registration begins, or the schedule is published, you will help us get the word out!

DrupalCon Seattle is coming up April 8-12, 2019, and we're featuring some bold new changes to support a variety of audiences: from our traditional core of people who build Drupal, to marketers and content editors, to the agency sales forces that sell Drupal to the world.

———

As always, we’d like to say thanks to all the volunteers who work with us, and to the Drupal Association Supporters, who make it possible for us to work on these projects. In particular we want to thank:

If you would like to support our work as an individual or an organization, consider becoming a member of the Drupal Association.

Follow us on Twitter for regular updates: @drupal_org, @drupal_infra

Platform.sh: How micro is your microservice?

August 7, 2018 - 23:02

"Microservices" have been all the rage for the past several years. They're the new way to make applications scalable, robust, and break down the old silos that kept different layers of an application at odds with each other.

But let's not pretend they don't have costs of their own. They do. And, in fact, they are frequently, perhaps most of the time, not the right choice. There are, however, other options besides one monolith to rule them all and microservice-all-the-things.

What is a microservice?

As usual, let's start with the canonical source of human knowledge, Wikipedia:

"There is no industry consensus yet regarding the properties of microservices, and an official definition is missing as well."

Well that was helpful.

Still, there are common attributes that tend to typify a microservice design:

  • Single-purpose components
  • Linked together over a non-shared medium (usually a network with HTTP or similar, but technically inter-process communication would qualify)
  • Maintained by separate teams
  • And released (or replaced) on their own, independent schedule

The separate teams part is often overlooked, but shouldn't be. The advantages of the microservice approach make it clear why:

  • Allows the use of different languages and tools for different services (PHP/MongoDB for one and Node/MySQL for another, for instance)
  • Allows small, interdisciplinary teams to manage targeted components (that is, the team has one coder, one UI person, and one DB monkey rather than having a team of coders, a team of UI people, and a team of DB monkeys)
  • Allows different components to evolve and scale independently
  • Encourages strong separation of concerns

Most of those benefits tie closely to Conway's Law:

Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure.

A microservice approach works best when you have discrete teams that can view each other as customers or vendors, despite being within the same organization. And if you're in an organization where that's the case then microservices are definitely an approach to consider.

However, as with any architecture there are tradeoffs. Microservices have cost:

  • Adding network services to your system introduces the network as a point of failure.
  • "Points" of failure should really be plural: a network, even a virtual and containerized one, has many, many points of failure.
  • The network will always be 10x slower than calling a function, even a virtual network. If you're using a shared-nothing framework like PHP you have to factor in the process startup cost of every microservice.
  • If you need to move some logic from one microservice to another it's 10x harder than from one library to another within an application.
  • You need to staff multiple interdisciplinary teams.
  • Teams need to coordinate carefully to avoid breaking any informal APIs.
  • APIs between services tend to be coarse: needing new information from another team involves a much longer turnaround time than just accessing a database.

Or, more simply: Microservices add complexity. A lot of complexity. That means a lot more places where things can go wrong. A common refrain from microservice skeptics (with whom I agree) is

"if one of your microservices going down means the others don't work, you don't have a microservice; you have a distributed monolith."

To be sure, that doesn't mean you shouldn't use microservices. Sometimes that is the right approach to a problem. However, the scale at which that's the is considerably higher than most people realize.

What's the alternative?

Fortunately, there are other options besides the extremes of a single monolith and a sprawling collection of separate applications that happen to talk to each other. There's no formal term for these yet, but I will refer to them as "clustered applications".

A clustered application:

  • Is maintained by a single interdisciplinary team
  • Is split into discrete components that run as their own processes, possibly in separate containers
  • Deploys as a single unit
  • May be in multiple languages but usually uses a single language
  • May share its datastore(s) between processes

This "in between" model has been with us for a very long time. The simplest example is also the oldest: cron tasks. Especially in the PHP world, many applications have had a separate cron process from their web request/response process for literally decades. The web process exists as, essentially, a monolith, but any tasks that can be pushed off to "later" get saved for later. The cron process, which could share, some, all, or none of the same code, takes care of the "later". That could include sending emails, maintenance tasks, refreshing 3rd party data, and anything else that doesn't have to happen immediately upon a user request for the response to be generated.

Moving up a level from cron are queue workers. Again, the idea is to split off any tasks that do not absolutely need to be completed before a response can be generated and push them to "later". In the case of a queue worker "later" is generally sooner than with a cron job but that's not guaranteed. The workers could be part and parcel of the application, or they could be a stand-alone application in the same language, or they could be in an entirely different language. A PHP application with a Node.js worker is one common pattern, but it could really be any combination.
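
Purely as an illustration of that pattern (the post itself is framework-agnostic), here is roughly what "push it to later" looks like with Drupal 8's Queue API; the module name, queue ID, and mail key below are made up, and the module is assumed to implement hook_mail() for the 'notify' key:

<?php
// my_module/src/Plugin/QueueWorker/EmailNotifier.php (hypothetical module).

namespace Drupal\my_module\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;

/**
 * Processes queued notification emails whenever cron runs.
 *
 * @QueueWorker(
 *   id = "my_module_email_notifier",
 *   title = @Translation("Email notifier"),
 *   cron = {"time" = 30}
 * )
 */
class EmailNotifier extends QueueWorkerBase {

  /**
   * Called once per queued item; the web request never waits for this.
   */
  public function processItem($data) {
    \Drupal::service('plugin.manager.mail')
      ->mail('my_module', 'notify', $data['to'], 'en', $data);
  }

}

During the web request the work is simply deferred with \Drupal::queue('my_module_email_notifier')->createItem(['to' => 'someone@example.com']); and the response goes out immediately.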

Another variant is to make an "Admin" area of a site a separate application from the front-end. It would still be working on the same database, but it's possible then to have two entirely separate user pools, two different sets of access control, two different caching configurations, etc. Often the admin could be built as just an API with a single-page-app frontend (since all users will be authenticated with a known set of browser characteristics and no need for SEO) while the public-facing application produces straight HTML for better performance, scalability, cacheability, accessibility, and SEO.

Similarly, one could make a website in Django but build a partner REST API in a separate application, possibly in Go to squeeze the last drop of performance out of your system.

There's an important commonality to all of these examples: Any given web request runs through exactly one of them at a time. That helps to avoid the main pitfall of microservices, which is adding network requests to every web request. The fewer internal IO calls you have the better; just ask anyone who's complained about an application making too many SQL queries per request. The boundaries where it's reasonable to "cut" an application into multiple clustered services are anywhere there is, or can be, an asynchronous boundary.

There is still additional complexity overhead beyond a traditional monolith: while an individual request only needs one working service and there's only one team to coordinate, there are still multiple services to manage. The communication paths between them are still points of failure, even if they are far more tolerant of slow performance. There could also be an unpredictable delay between actions; an hourly cron could run 1 minute or 59 minutes after the web request that gave it an email to send. A queue could fill up with lots of traffic. Queues are not always perfectly reliable.

Still, that cost is lower than the overhead of full separate-team microservices while offering many (but not all) of the benefits in terms of separation of concerns and allowing different parts of the system to scale and evolve mostly independently. (You can always throw more worker processes at the queue even if you don't need more resources for web requests.) It's a model well worth considering before diving into microservices.

How do I do either of these on Platform.sh?

I'm so glad you asked! Platform.sh is quite capable of supporting both models. While our CPO might yell at me for this, I would say that if you want to do "microservices" you need multiple Platform.sh projects.

Each microservice is supposed to have its own team, its own datastore, its own release cycle, etc. Doing that in a single project, with a single Git repository, is rather counter to that design. If your system is to be built with 4 microservices, then that's 4 Platform.sh projects; however, bear in mind that's a logical separation. Since they're all on Platform.sh and presumably in the same region, they're still physically located in the same data center. The latency between them shouldn't be noticeably different than if they were in the same project.

Clustered applications, though, are where Platform.sh especially shines. A single project/Git repository can contain multiple applications, in the same language or in different languages, and they can share the same data stores or not.

To use the same codebase for both the web front-end and a background worker (which is very common), we support the ability to spin up the same built application image as a separate worker container. Each container runs the same codebase but can have a different disk configuration, different environment variables, and start a different process. Because they all run the same code base, there's only a single code base to maintain, a single set of unit tests to write, and so on.

And of course cron tasks are available on every app container for all the things cron tasks are good for.

Within a clustered application processes will usually communicate either by sharing a database (be it MariaDB, PostgreSQL, or MongoDB) or through a queue server, for which we offer RabbitMQ.
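
For the queue-server route, the web process and the worker only need to agree on a queue name. Here is a minimal sketch using the php-amqplib client; the hostname and credentials are placeholders, and in practice you would read them from your environment's relationship configuration:

<?php
require 'vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

// Placeholder connection details; substitute your RabbitMQ service values.
$connection = new AMQPStreamConnection('rabbitmq.internal', 5672, 'guest', 'guest');
$channel = $connection->channel();

// Durable queue shared by the web container (producer) and the worker (consumer).
$channel->queue_declare('jobs', false, true, false, false);

// Hand the task off and return to the user immediately.
$channel->basic_publish(new AMQPMessage(json_encode(['task' => 'send_email'])), '', 'jobs');

$channel->close();
$connection->close();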

Mixing and matching is also entirely possible. In a past life (in the bad old days before Platform.sh existed) I built a customer site that consisted of an admin curation tool built in Drupal 7 that pulled data in from a 3rd party, allowed users to process it, and then exported pre-formatted JSON to Elasticsearch. That exporting was done via a cron job, however, to avoid blocking the UI. A Silex application then served a read-only API off of the data in Elasticsearch, and far faster than a Drupal request could possibly have done.

Were I building that system today it would make a perfect case for a multi-app Platform.sh project: A Drupal app container, a MySQL service, an Elasticsearch service, and a Silex app container.

Please code responsibly

There are always tradeoffs in different software design decisions. Sometimes the extra management, performance, and complexity overhead of microservices is worth it. Sometimes it's... not, and a tried-and-true monolith is the most effective solution.

Or maybe there's an in-between that will get you a better balance between complexity, performance, and scalability. Sometimes all you need is "just" a clustered application.

Pick the approach that fits your needs best, not the one that fits the marketing zeitgeist best. Don't worry, we can handle all of them.

Larry Garfield 7 Aug, 2018

Community: Governance Task Force Community Update, August 2018

August 7, 2018 - 22:40

This is a public update on the work of the Governance Task Force.

We have progressed into what we are calling the “Engagement Phase” of our schedule: interviewing community members and working groups, and soliciting feedback at meetups and camps. To date we have interviewed at least 18 people (including community members, liaisons, and leadership) and 3 groups, with at least 15 more interviews being scheduled.

Interviews

If you would like to participate in an interview, please contact any member of the Governance Task Force or sign up using this Google form.

The purpose of interviews is to meet with people individually to get feedback and ideas, and have a conversation about community governance (non-technical governance). Many governance-related discussions have occurred in the past, but we want to make sure everyone has an opportunity to be heard, since group discussions are not always conducive to individual thoughts. Notes taken during an interview are available to, and editable by, the person interviewed, and are not shared outside the Governance Task Force. If you have any concerns about a language barrier, privacy, or anything else related to participating in an interview, contact us. We will do what we can to work with you.

Analysis

The individual interviews are a new step in this governance process, but we do have access to a lot of information that was already compiled from prior discussions. Many town hall style discussions were held over the past year, and we are using all of that information. As we progress into the “Analysis Phase” we are using that information to develop user stories and ideas that will help inform our eventual proposal. Once the interviews are concluded, their analysis will be merged with the existing information.

Drupal Europe

Rachel, Ela, and Stella will be providing an update on the task force’s efforts at Drupal Europe. Findings will be shared and there will be an open discussion to hear from attendees to inform our efforts.

Ongoing Feedback

The task force is committed to working transparently and delivering a well-rounded proposal for both the community and for leadership to review. We believe the proposal presents a great opportunity to help evolve community governance and inform next steps. Should you want to contact the Governance Task Force, feel free to reach out to any member of the group via Slack, drupal.org, or any public place you find our members.

We’ve also set up a Google form for direct feedback. If you do not want to participate in an interview, but do want to contribute your individual thoughts, use this form. You can answer as many or as few questions as you like, and you can submit the form anonymously. This form will stay active throughout the proposal process, so if you have thoughts to share at a later date, you can still use it.

Adam Bergstein
David Hernandez
Ela Meier
Hussain Abbas
Lyndsey Jackson
Rachel Lawson
Stella Power

Ixis.co.uk - Thoughts: Last month in Drupal - July 2018

August 7, 2018 - 22:22
July has been and gone, so here we take a look back at all the best bits of news that have hit the Drupal community over the last month.

Drupal Development

Dries Buytaert discussed why more and more large corporations are beginning to contribute to Drupal. He shares an extended interview with Pfizer Director Mike Lamb, who explains why his development team has ingrained open source contribution into the way they work. Drupal 8.5.5 was released in July; this patch release for Drupal 8 contained a number of bug fixes, along with documentation and testing improvements. It was announced that Drupal 8.6.0 will be released on September 5th: the alpha version was released the week beginning July 16th, and the beta followed the week of July 29th. This release will bring a number of new features, and Drupal has published a roadmap of all the fixes and features it aims to have ready for the new release.

Events

Drupal Europe announced 162 hours of sessions and 9 workshops for the event on Tuesday, Wednesday and Thursday. They also urge anyone with ideas for social events at this year's event to submit them, to help fill out the social calendar with community-led ideas. On August 17-19, New York will play host to the second Decoupled Drupal Days. For those that don't know, Decoupled Drupal Days gathers technologists, marketers and content professionals who build and use Drupal as a content service -- for decoupled front ends, content APIs, IoT, and more. DrupalCamp Colorado also recently took place. The event proved popular as usual, and this year's keynote, "The Do-ocracy Dilemma and Compassionate Contribution", was delivered by Acquia Director of Research and Innovation, Preston So. Preston discussed why a more compassionate approach to contribution is so critical when it comes to managing open-source projects, crafting conference lineups, enabling a successful team, and building a winning business.

New Modules

New modules, updates and projects were of course released throughout July; the pick of the bunch includes:

  • Commerce 8.x-2.8 - the e-commerce suite sees a number of bug fixes
  • google_analytics 8.x-2.3 - the module sees a number of bug fixes
  • Drupal 8.5.5 - a patch release with a number of bug fixes and testing improvements

That is the end of this month's round up. Keep an eye out for next month's, where we'll cover all the latest developments in the Drupal community and all the important news affecting the wider community. Missed last month's round up? Check it out on the Ixis site now.

Amazee Labs: Transparent Database Sanitization with GDPR-dump

August 7, 2018 - 19:12
Blaize Kaye / August 7, 2018

With GDPR in full effect, sanitization of user data is a fairly hot topic. Here at Amazee we take our clients and our clients’ clients privacy seriously, so we have been investigating several possible approaches to anonymizing data.

In the Drupal world, and the PHP world more generally, there are several options available. Here, though, I’d like to discuss one we think is particularly cool.

At Amazee Labs’ Global Maintenance, we work with several different projects per day. We move data from our production to staging and dev servers, and from our servers to our local development environments. Especially on legacy systems, site-specific configuration details often exist only in the databases, and even if that weren’t the case, the issues we’re investigating routinely require that we dig into the database as it (more or less) is on the production servers. Anonymization is crucial for our day to day work.

So our consideration here is: how do we balance productivity while keeping things anonymous?

One way of achieving this is to make anonymization transparent to the developer. Essentially, we want our developers to be able to pull down the live database as it exists at the moment they pull it down, and have it be anonymized.

How can we achieve this?

Well, one way is to analyse the daily workflow to see if there are any points through which the data has to flow before it reaches the developer.

It turns out that, if you’re working with MySQL, this “final common path” that the data flows through is the mysqldump utility.

If you’re running backups, chances are you’re using mysqldump.

If you’re doing a drush sql-sync there’s a call to mysqldump right at the heart of that process.

Mysqldump is everywhere.

The question is, though, how do we anonymize data using mysqldump?

The standard mysqldump binary doesn’t support anonymization of data, and short of writing some kind of plugin, this is a non-starter.

Fortunately for us, Axel Rutz came up with an elegant solution: a drop-in replacement for the mysqldump binary, which he called gdpr-dump. A few of us here at Amazee loved what he was doing and started chipping in.

The central idea is to replace the standard mysqldump with gdpr-dump so that any time the former is called, the latter is called instead.

Once the mysqldump call has been hijacked, so to speak, the first order of business is to make sure that we are actually able to dump the database as expected.

This is where mysqldump-php comes in. It’s the library on which the entire gdpr-dump project is based. It provides a pure PHP implementation of mysqldump as a set of classes. On its own, it simply dumps the database, just as the native mysqldump cli tool does.

A great starting point, but it only gets us part of the way.

What we’ve added is the ability to describe which tables and columns in the database being dumped you would like to anonymize. If, for instance, you have a table of user data with names, email addresses, telephone numbers, and so on, you can describe the structure of this table to gdpr-dump and it will generate fake, but realistic-looking, data using the Faker library.

This requires some upfront work, mapping the tables and columns, but once it is done you’re able to call mysqldump in virtually any context, and it will produce an anonymized version of your database.
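
To give a feel for the kind of replacement data Faker produces, here is plain Faker usage on its own - an illustration only, not gdpr-dump's configuration format:

<?php
require 'vendor/autoload.php';

// Each property returns plausible-looking but entirely fabricated data.
$faker = Faker\Factory::create();

echo $faker->name, "\n";        // e.g. a realistic-looking full name
echo $faker->safeEmail, "\n";   // e.g. an address under example.org
echo $faker->phoneNumber, "\n"; // e.g. a formatted phone number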

There is still a lot of thinking and work to be done, but we think it’s worth investing time in this approach. The fact that it can be used transparently is its most compelling aspect - being able to simply swap out mysqldump with gdpr-dump and have the anonymization work without having to change any of the dependent processes.

If any of this piques your interest and you’re looking for more details about how you might be able to use gdpr-dump in your own workflow, feel free to check out the project (and submit PRs): https://github.com/machbarmacher/gdpr-dump.

ADCI Solutions: Visual regression testing with BackstopJS

August 7, 2018 - 17:26

The larger a project, the more time you will spend on regression testing after each change. But there are a lot of tools that can help you reduce the effort this process takes. One of them is BackstopJS.

Get acquainted with BackstopJS

OSTraining: The Ultimate Tutorial for Drupal's Paragraphs Module

August 7, 2018 - 00:25

Over the last few months we've worked with more and more Drupal 8 sites. Those projects all had one thing in common ... they used the Drupal Paragraphs module.

Paragraphs is a very popular module for handling content in Drupal 8.

Paragraphs works in a similar way to content fields, but also provides a wide range of options for the design, layout and grouping of your content.