Forge Canvas Template: Product repurchase reminder (with timing and channel testing)
Hello fellow Braze orchestrators, I didn't make it to Forge in person this year, but it was great tuning in to the keynotes and catching up on all the latest Braze innovations. 🔥 Here is a quick Canvas that we are about to launch to drive repurchase of two of our products that customers need to buy on a regular basis.

Entry criteria: purchase of a certain SKU

Experimentation (both set up as Personalized Path):
- Message timing (30 days vs 45 days after the purchase)
- Message channel (email vs SMS, in this case)

Message sequence:
1. The user gets a reminder to repurchase the product they might be running low on.
2. If the user engages with the message from step 1 but does not purchase, they receive a reminder 3 days later with a special offer to incentivize the purchase.

Let me know if you see any opportunities for improvement. I imagine we will soon be able to simplify this flow by leveraging all the upcoming Braze AI tools.

Cheers, Olga Baker, Charlie Banana / P&G
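For anyone mapping this out before building it in the dashboard, here is a minimal Python sketch of the branching logic described above. It is plain Python, not the Braze API: the User fields and step dictionaries are hypothetical, and the random arm assignment merely stands in for the Personalized Path decision, which would pick the arm per user.

```python
from dataclasses import dataclass
import random

@dataclass
class User:
    purchased_sku: bool            # entry criterion: bought the target SKU
    engaged_step_1: bool           # opened/clicked the first reminder
    purchased_after_step_1: bool   # converted without needing the offer

def build_journey(user: User) -> dict:
    """Return the message steps this user would receive (illustrative only)."""
    if not user.purchased_sku:
        return {"entered": False, "steps": []}

    # Stand-ins for the two Personalized Path experiments.
    timing_days = random.choice([30, 45])        # message timing experiment
    channel = random.choice(["email", "sms"])    # message channel experiment

    steps = [{"delay_days": timing_days, "channel": channel,
              "message": "repurchase reminder"}]
    if user.engaged_step_1 and not user.purchased_after_step_1:
        steps.append({"delay_days": 3, "channel": channel,
                      "message": "repurchase reminder with special offer"})
    return {"entered": True, "steps": steps}

print(build_journey(User(purchased_sku=True, engaged_step_1=True,
                         purchased_after_step_1=False)))
```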
Forge Canvas Template: Frequency/Channel mix Experiment Setup
This Canvas serves as a starting point for setting up an experiment aimed at sending specific sub-groups of a user population down the engagement journey that has the highest likelihood of converting them. The criteria that differentiate the journeys in this Canvas are tied to two variables:

1. Frequency: n times over n periods, depending on the use case, industry, and type of product. For example, n times per week for n weeks. Both the 'times' and 'periods' should be customized according to the use case.
2. Channel mix: combinations of at least 2 channels, based on the use case, industry, and type of product. For example, a push notification followed by an email, or an IAM followed by an email. The channels to test should be customized based on the use case.

The goal here is to leverage Braze's personalization systems to take experiments to the next level: no more simple A/B tests where all users are sent down the engagement journey that converts the most overall. Instead, specific sub-groups of the population will be sent down different, specific paths based on their unique characteristics/preferences. This allows us to avoid a 'one size fits all' approach and instead create multiple journeys that users can experience based on what they prefer (and what tends to convert users like them better 😉).

I've been using this setup for a while, and the uplift compared to a single experience for the entire population (chosen after multiple iterations of a classic A/B test) is significant. Let me know what you think and if you have any ideas for improving this setup!
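To make the two variables concrete, here is a small Python sketch (not Braze functionality) that expands a frequency/channel-mix path into a per-user send schedule. The path definitions and the segment-to-path lookup are hypothetical placeholders; in the Canvas itself, the Personalized Path experiment is what learns that routing.

```python
# Hypothetical paths: each pairs a cadence (n sends per week for n weeks)
# with a channel sequence.
PATHS = {
    "A": {"sends_per_week": 2, "weeks": 3, "channels": ["push", "email"]},
    "B": {"sends_per_week": 1, "weeks": 4, "channels": ["in_app_message", "email"]},
    "C": {"sends_per_week": 3, "weeks": 2, "channels": ["email", "push"]},
}

SEGMENT_TO_PATH = {"new": "A", "dormant": "B"}   # placeholder routing

def schedule_for(user_segment: str) -> list[dict]:
    """Expand a path into a flat per-user send schedule (illustrative only)."""
    path = PATHS[SEGMENT_TO_PATH.get(user_segment, "C")]
    schedule = []
    for week in range(1, path["weeks"] + 1):
        for send in range(path["sends_per_week"]):
            channel = path["channels"][send % len(path["channels"])]
            schedule.append({"week": week, "send": send + 1, "channel": channel})
    return schedule

print(schedule_for("new")[:4])
```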
Conversion Rate over 100%?
Hi, I'm struggling to understand the performance of one of our Canvases. I have both the conversion event and the exit criteria set to "any purchase", and users cannot re-enter the Canvas. When I analyze the variants, I see more conversions than entries, leading to a conversion rate over 100%, which shouldn't be possible. Does anybody know what could be going wrong here? Thanks!
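Not an answer, but a quick illustration of one way the arithmetic can end up over 100%: if conversions are tallied as events while entries are tallied as unique users, the ratio can exceed 1. The numbers below are made up, and this is not a claim about how Braze actually counts.

```python
# Hypothetical numbers for one variant
entries = 120               # unique users who entered the variant
conversion_events = 150     # "any purchase" events attributed to those users
unique_converters = 95      # users with at least one attributed purchase

rate_by_events = conversion_events / entries   # 1.25 -> would display as 125%
rate_by_users = unique_converters / entries    # ~0.79 -> 79%
print(f"{rate_by_events:.0%} vs {rate_by_users:.0%}")
```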
The Personalized Variation Test in Campaign
We are doing our first A/B test out of Braze, and the client wants to use the Personalized Variation test rather than the Winning Variation test. Since we don't have much campaign history to go by (we only just started deploying out of Braze), does that matter if we still run the Personalized Variation test? Does it go by past-history characteristics, or solely by similar characteristics within the initial test?
Question on testing for BlaBlaCar - Test and Learn AMA
Guendalina, in the intro to this AMA session, I saw that you've been testing different "From" names in different markets. We've done this as well in multiple markets across Europe and North America. I'm curious to learn which types of names worked better than others, how many variations you tested, and how long the tests lasted.
Ideas for email open and click through rate testing
Hi all, just wondering what sort of tests others run on open rate and click-through rate. For example:
- Subject lines
- Time of day
- CTA wording
- Localisation

Would love to gather a laundry list of ideas to experiment with. 📃😁
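If it helps when evaluating any of the ideas above, here is a minimal Python sketch (standard library only, hypothetical numbers) of a two-proportion z-test for deciding whether one subject line's open rate genuinely beats another's; the same check works for click-through rate.

```python
import math

def two_proportion_z_test(opens_a, sends_a, opens_b, sends_b):
    """Two-sided z-test comparing the open rates of two variants."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
    return z, p_value

# Hypothetical: variant A 2,100 opens / 10,000 sends vs. variant B 2,300 / 10,000
z, p = two_proportion_z_test(2100, 10_000, 2300, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```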
Measuring Primary Conversion (A) and Conversion B in Canvas
Hi all, hope you are all doing well. I had a query about tracking Conversion Events A and B in a Canvas. If my primary conversion is defined as "sales form completed" and the secondary as "sales form started", could the same users potentially appear in both conversion events, or are only unique users counted (once) across both Conversion Events A and B? Cheers, Raj
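For intuition while this is open: treated purely as sets of users, the two conversion definitions can overlap, since anyone who completed the form presumably also started it. A tiny illustration with made-up events, not a statement about how Braze attributes them:

```python
# Made-up event log
events = [
    {"user": "u1", "event": "sales form started"},
    {"user": "u1", "event": "sales form completed"},
    {"user": "u2", "event": "sales form started"},
]

conversion_a = {e["user"] for e in events if e["event"] == "sales form completed"}  # primary
conversion_b = {e["user"] for e in events if e["event"] == "sales form started"}    # secondary

print("in both A and B:", conversion_a & conversion_b)   # {'u1'}
```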
How long do you run A/B or similar tests and why? Use cases?
Hi all, curious how long you run A/B tests, or any other tests, before making a call on the data. I imagine it varies with the use case and would be keen to hear others' thoughts and scenarios. Thanks 😊
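One way to put a number on "how long": estimate the sample size needed to detect the uplift you care about, then divide by your daily entries per variant. A rough standard-library Python sketch of the usual two-proportion sample-size formula, with hypothetical inputs:

```python
import math

def sample_size_per_variant(baseline_rate, relative_uplift, z_alpha=1.96, z_beta=0.84):
    """Approximate users needed per variant (two-sided alpha=0.05, power=0.80)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_uplift)
    pooled = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * pooled * (1 - pooled)) +
          z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p1 - p2) ** 2
    return math.ceil(n)

# Hypothetical: 5% baseline conversion, want to detect a 10% relative lift
n = sample_size_per_variant(0.05, 0.10)
print(n, "users per variant; days to run = n / daily entries per variant")
```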
Action, personalisation, connection layers etc. - What are they in Braze?
Hello everyone! What do the different layers of the Braze tech stack (connection, personalisation, and action) correspond to in terms of functions/features? And how do these layers interact with each other in different scenarios? I know this last question (on the sequence of a message through the tech stack) is one of the questions in the study guide for the admin certificate, and it caught me completely off guard. I've been looking throughout the documentation and have pieced together what the other layers "are", but can't find definite answers about these three or how they all interact in common scenarios. Would appreciate any insight!